Appearance-Guided Particle Filtering for Articulated Hand Tracking
Abstract
We propose a model-based tracking method, called appearance-guided particle filtering (AGPF), which integrates both sequential motion transition information and appearance information. A probability propagation model is derived from a Bayesian formulation for this framework, and a sequential Monte Carlo method is introduced for its realization. We apply the proposed method to articulated hand tracking and show that it outperforms methods that use only sequential motion transition information or only appearance information.
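The abstract's "sequential Monte Carlo" realization refers to the particle-filtering family of methods. As a minimal sketch (not the paper's AGPF algorithm), the following shows a generic bootstrap particle filter on a hypothetical 1-D toy state: propagate particles through a motion model, weight them by an observation likelihood, and resample. AGPF additionally guides particle generation with appearance information, which this sketch omits.

```python
import numpy as np

def particle_filter(observations, n_particles=500, motion_std=0.5, obs_std=1.0, seed=0):
    """Generic bootstrap (SIR) particle filter for a 1-D state.

    Toy assumptions: random-walk motion model, Gaussian observation
    likelihood. Illustrative only; not the paper's AGPF method.
    """
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)  # initial prior samples
    estimates = []
    for z in observations:
        # 1. Propagate particles through the motion (transition) model.
        particles = particles + rng.normal(0.0, motion_std, n_particles)
        # 2. Weight particles by the observation likelihood p(z | x).
        weights = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights /= weights.sum()
        # 3. Posterior-mean estimate, then resample to avoid degeneracy.
        estimates.append(float(np.sum(weights * particles)))
        particles = rng.choice(particles, size=n_particles, p=weights)
    return estimates

# Toy usage: track a slowly drifting value from noisy measurements.
true_states = np.linspace(0.0, 5.0, 20)
obs = true_states + np.random.default_rng(1).normal(0.0, 1.0, 20)
est = particle_filter(obs)
```

In AGPF terms, step 1 corresponds to the sequential motion transition information; the appearance information would be used as an additional source for drawing candidate particles.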
Cite
Text
Chang et al. "Appearance-Guided Particle Filtering for Articulated Hand Tracking." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2005. doi:10.1109/CVPR.2005.72
Markdown
[Chang et al. "Appearance-Guided Particle Filtering for Articulated Hand Tracking." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2005.](https://mlanthology.org/cvpr/2005/chang2005cvpr-appearance/) doi:10.1109/CVPR.2005.72
BibTeX
@inproceedings{chang2005cvpr-appearance,
title = {{Appearance-Guided Particle Filtering for Articulated Hand Tracking}},
author = {Chang, Wen-Yan and Chen, Chu-Song and Hung, Yi-Ping},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year = {2005},
pages = {235-242},
doi = {10.1109/CVPR.2005.72},
url = {https://mlanthology.org/cvpr/2005/chang2005cvpr-appearance/}
}