Particle Video: Long-Range Motion Estimation Using Point Trajectories
Abstract
This paper describes a new approach to motion estimation in video. We represent video motion using a set of particles. Each particle is an image point sample with a long-duration trajectory and other properties. To optimize these particles, we measure point-based matching along the particle trajectories and distortion between the particles. The resulting motion representation is useful for a variety of applications and cannot be directly obtained using existing methods such as optical flow or feature tracking. We demonstrate the algorithm on challenging real-world videos that include complex scene geometry, multiple types of occlusion, regions with low texture, and non-rigid deformations.
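The abstract's two optimization criteria, appearance matching along each particle's trajectory and distortion between linked particles, can be illustrated with a minimal sketch. This is a hypothetical toy formulation for intuition only, not the paper's actual energy; the function names, patch size, and squared-difference costs are assumptions.

```python
import numpy as np

def matching_cost(frames, traj, patch=2):
    """Appearance term (illustrative): sum of squared patch differences
    between consecutive frames at each particle position.
    `frames` is a list of 2-D grayscale arrays; `traj` is a list of
    integer (x, y) positions, one per frame."""
    cost = 0.0
    for t in range(len(traj) - 1):
        (x0, y0), (x1, y1) = traj[t], traj[t + 1]
        p0 = frames[t][y0 - patch:y0 + patch + 1, x0 - patch:x0 + patch + 1]
        p1 = frames[t + 1][y1 - patch:y1 + patch + 1, x1 - patch:x1 + patch + 1]
        cost += float(np.sum((p0.astype(float) - p1.astype(float)) ** 2))
    return cost

def distortion_cost(traj_a, traj_b):
    """Distortion term (illustrative): penalize differences between the
    frame-to-frame motions of two linked particles, so that nearby
    particles are encouraged to move coherently."""
    cost = 0.0
    for t in range(len(traj_a) - 1):
        da = np.subtract(traj_a[t + 1], traj_a[t])  # motion of particle a
        db = np.subtract(traj_b[t + 1], traj_b[t])  # motion of particle b
        cost += float(np.sum((da - db) ** 2))
    return cost
```

Two particles translating in parallel incur zero distortion cost, while particles moving in different directions (e.g. across an occlusion boundary) are penalized, which is the intuition behind using such a term alongside per-particle appearance matching.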
Cite
Text

Sand and Teller. "Particle Video: Long-Range Motion Estimation Using Point Trajectories." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2006. doi:10.1109/CVPR.2006.219

Markdown

[Sand and Teller. "Particle Video: Long-Range Motion Estimation Using Point Trajectories." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2006.](https://mlanthology.org/cvpr/2006/sand2006cvpr-particle/) doi:10.1109/CVPR.2006.219

BibTeX
@inproceedings{sand2006cvpr-particle,
title = {{Particle Video: Long-Range Motion Estimation Using Point Trajectories}},
author = {Sand, Peter and Teller, Seth J.},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year = {2006},
pages = {2195--2202},
doi = {10.1109/CVPR.2006.219},
url = {https://mlanthology.org/cvpr/2006/sand2006cvpr-particle/}
}