Probabilistic Tracking in a Metric Space
Abstract
A new exemplar-based, probabilistic paradigm for visual tracking is presented. Probabilistic mechanisms are attractive because they handle fusion of information, especially temporal fusion, in a principled manner. Exemplars are selected representatives of raw training data, used here to represent probabilistic mixture distributions of object configurations. Their use avoids tedious hand-construction of object models and problems with changes of topology. Using exemplars in place of a parameterized model poses several challenges, addressed here with what we call the "Metric Mixture" (M²) approach. The M² model has several valuable properties. Principally, it provides alternatives to standard learning algorithms by allowing the use of metrics that are not embedded in a vector space. Secondly, it uses a noise model that is learned from training data. Lastly, it eliminates any need for an assumption of probabilistic pixelwise independence. Experiments demonstrate the effectiveness of the M² model in two domains: tracking walking people using chamfer distances on binary edge images, and tracking mouth movements by means of a shuffle distance.
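To make the chamfer-distance domain concrete, here is a minimal sketch of a one-directional chamfer distance between two binary edge maps. This is an illustration only, not the paper's implementation: the function name and the brute-force nearest-neighbor computation are our own choices (trackers typically precompute a distance transform of one image instead).

```python
import numpy as np

def chamfer_distance(edges_a, edges_b):
    """One-directional chamfer distance between two boolean edge maps.

    For each edge pixel of edges_a, compute the Euclidean distance to the
    nearest edge pixel of edges_b, then average over edges_a's pixels.
    Brute force for clarity; a distance transform of edges_b makes this
    O(pixels) in practice.
    """
    pts_a = np.argwhere(edges_a)  # (Na, 2) coordinates of edge pixels
    pts_b = np.argwhere(edges_b)  # (Nb, 2)
    # Pairwise distances, shape (Na, Nb), via broadcasting.
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=-1)
    # Nearest neighbor in edges_b for each pixel of edges_a, averaged.
    return d.min(axis=1).mean()
```

Note that this quantity is asymmetric in its arguments; it compares an observed edge image against an exemplar without requiring the images to live in a common vector space, which is exactly the property the M² approach exploits.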
Cite
Text
Toyama and Blake. "Probabilistic Tracking in a Metric Space." IEEE/CVF International Conference on Computer Vision, 2001. doi:10.1109/ICCV.2001.937599
Markdown
[Toyama and Blake. "Probabilistic Tracking in a Metric Space." IEEE/CVF International Conference on Computer Vision, 2001.](https://mlanthology.org/iccv/2001/toyama2001iccv-probabilistic/) doi:10.1109/ICCV.2001.937599
BibTeX
@inproceedings{toyama2001iccv-probabilistic,
title = {{Probabilistic Tracking in a Metric Space}},
author = {Toyama, Kentaro and Blake, Andrew},
booktitle = {IEEE/CVF International Conference on Computer Vision},
year = {2001},
pages = {50-59},
doi = {10.1109/ICCV.2001.937599},
url = {https://mlanthology.org/iccv/2001/toyama2001iccv-probabilistic/}
}