Granularity and Elasticity Adaptation in Visual Tracking
Abstract
The observation models in tracking algorithms are critical to both tracking performance and applicable scenarios but are often simplified to focus on a fixed level of certain target properties such as appearance and structure. In this paper, we propose a unified tracking paradigm in which targets are represented by Markov random fields of interest regions, and we introduce a new way to adapt observation models by automatically tuning the feature granularity and model elasticity, i.e., the abstraction level of features and the model's degree of flexibility in tolerating deformations. Specifically, we employ a multi-scale scheme to extract features from interest regions and adjust the parameters of the potential functions of the MRF model to maximize the likelihoods of tracking results. Experiments demonstrate that the method can estimate translation, scaling, and rotation and deal with deformation, partial occlusion, and camouflaged objects within this unified framework.
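As a rough illustration of the idea sketched in the abstract, the snippet below shows a minimal pairwise-MRF energy over interest regions in which a single `elasticity` parameter scales the structural-consistency penalty: a small value lets regions drift from their reference layout (tolerating deformation), while a large value enforces rigid structure. All names and the specific cost forms here are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch of an MRF observation model with tunable elasticity.
# The cost functions and parameter names are illustrative, not from the paper.

def unary_cost(feature, template_feature):
    """Appearance mismatch of one interest region (squared feature distance)."""
    return sum((f - t) ** 2 for f, t in zip(feature, template_feature))

def pairwise_cost(pos_i, pos_j, ref_offset, elasticity):
    """Penalty for deviating from the reference spatial offset between two regions.
    `elasticity` controls how strongly structure is enforced."""
    dx = (pos_j[0] - pos_i[0]) - ref_offset[0]
    dy = (pos_j[1] - pos_i[1]) - ref_offset[1]
    return elasticity * (dx * dx + dy * dy)

def mrf_energy(positions, features, templates, edges, ref_offsets, elasticity):
    """Total energy: appearance (unary) terms plus elasticity-weighted
    structure (pairwise) terms over the MRF edges."""
    energy = sum(unary_cost(features[i], templates[i])
                 for i in range(len(features)))
    for (i, j), offset in zip(edges, ref_offsets):
        energy += pairwise_cost(positions[i], positions[j], offset, elasticity)
    return energy
```

In this toy form, tuning elasticity amounts to choosing the weight that minimizes the energy of (maximizes the likelihood of) recent tracking results; feature granularity would correspond to extracting `features` at coarser or finer scales.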
Cite
Text
Yang and Wu. "Granularity and Elasticity Adaptation in Visual Tracking." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2008. doi:10.1109/CVPR.2008.4587550
Markdown
[Yang and Wu. "Granularity and Elasticity Adaptation in Visual Tracking." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2008.](https://mlanthology.org/cvpr/2008/yang2008cvpr-granularity/) doi:10.1109/CVPR.2008.4587550
BibTeX
@inproceedings{yang2008cvpr-granularity,
  title     = {{Granularity and Elasticity Adaptation in Visual Tracking}},
  author    = {Yang, Ming and Wu, Ying},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2008},
  doi       = {10.1109/CVPR.2008.4587550},
  url       = {https://mlanthology.org/cvpr/2008/yang2008cvpr-granularity/}
}