Trajectory Matching from Unsynchronized Videos
Abstract
This paper studies the problem of spatio-temporal matching between trajectories from two videos of the same scene. In real applications, trajectories are usually extracted independently in each video, so many of them remain "alone" (i.e., have no corresponding trajectory in the other video). We propose a novel matching algorithm that not only finds the existing correspondences between trajectories, but also recovers the corresponding trajectories of the "alone" ones. First, we cast trajectory matching as the problem of recovering missing elements of a matrix built from the matched trajectories of the two videos, which is naturally incomplete. Then, under an affine camera assumption, we recover the matrix using sparse representation and ℓ1-regularization techniques. Finally, the results are refined to the perspective-projection case by a local depth estimation procedure. The algorithm can handle noisy, incomplete, or outlying data. Experiments on both synthetic data and real videos show that the proposed method performs well.
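The central step summarized above, filling in missing entries of the incomplete trajectory matrix via sparse representation with ℓ1 regularization, can be illustrated with a small sketch. The snippet below is not the authors' implementation; it only shows the generic idea of expressing an incomplete trajectory column as a sparse combination of fully observed columns and reusing those coefficients to predict its missing rows. The function name `recover_column`, the use of scikit-learn's `Lasso` solver, the `alpha` value, and the data layout are all illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): recover the missing rows of one
# trajectory column by expressing it as a sparse, L1-regularized combination
# of fully observed columns, then filling the gaps with that combination.
import numpy as np
from sklearn.linear_model import Lasso

def recover_column(W_full, w_obs, obs_rows, alpha=1e-3):
    """W_full:   (2F, K) fully observed trajectory columns (stacked x, y rows).
    w_obs:    observed entries of the incomplete column, length len(obs_rows).
    obs_rows: row indices that are observed in the incomplete column.
    Returns the full-length recovered column."""
    # Sparse representation: w_obs ~= W_full[obs_rows] @ c with sparse c.
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    lasso.fit(W_full[obs_rows, :], w_obs)
    c = lasso.coef_
    # Reuse the same sparse coefficients to predict every row, missing ones included.
    return W_full @ c

# Toy usage: 30 frames (60 stacked rows), 12 complete columns; recover one
# column whose second half is unobserved. Low-rank data mimics the affine-camera case.
rng = np.random.default_rng(0)
basis = rng.normal(size=(60, 4))
W_full = basis @ rng.normal(size=(4, 12))
w_true = basis @ rng.normal(size=4)
obs_rows = np.arange(30)                      # only the first half observed
w_rec = recover_column(W_full, w_true[obs_rows], obs_rows)
print(np.abs(w_rec[30:] - w_true[30:]).max())  # recovery error on the missing half
```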
Cite
Text
Hu and Zhou. "Trajectory Matching from Unsynchronized Videos." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2010. doi:10.1109/CVPR.2010.5539811
Markdown
[Hu and Zhou. "Trajectory Matching from Unsynchronized Videos." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2010.](https://mlanthology.org/cvpr/2010/hu2010cvpr-trajectory/) doi:10.1109/CVPR.2010.5539811
BibTeX
@inproceedings{hu2010cvpr-trajectory,
title = {{Trajectory Matching from Unsynchronized Videos}},
author = {Hu, Han and Zhou, Jie},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year = {2010},
pages = {1347-1354},
doi = {10.1109/CVPR.2010.5539811},
url = {https://mlanthology.org/cvpr/2010/hu2010cvpr-trajectory/}
}