A Bayesian Framework for Multi-Cue 3D Object Tracking

Abstract

This paper presents a Bayesian framework for multi-cue 3D tracking of deformable objects. The proposed spatio-temporal object representation involves a set of distinct linear subspace models, or Dynamic Point Distribution Models (DPDMs), which can deal with both continuous and discontinuous appearance changes; the representation is learned fully automatically from training data. The representation is enriched with texture information by means of intensity histograms, which are compared using the Bhattacharyya coefficient. Direct 3D measurement is furthermore provided by a stereo system. State propagation is achieved by a particle filter which combines three cues (shape, texture, and depth) in its observation density function. The tracking framework integrates an independently operating object detection system by means of importance sampling. We illustrate the benefit of our integrated multi-cue tracking approach on pedestrian tracking from a moving vehicle.
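The histogram-comparison step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Bhattacharyya coefficient between two normalized histograms is standard, while `texture_likelihood` (including the `sigma` parameter and the exponential weighting) is a common choice in histogram-based tracking that the paper's exact observation density may not match.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Similarity of two histograms; 1.0 for identical distributions,
    0.0 for non-overlapping ones."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalize to unit mass so the coefficient lies in [0, 1].
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(np.sqrt(p * q)))

def texture_likelihood(candidate_hist, reference_hist, sigma=0.2):
    """Turn histogram similarity into a particle weight via the
    Bhattacharyya distance d^2 = 1 - BC (hypothetical weighting;
    sigma is an illustrative tuning parameter)."""
    d2 = 1.0 - bhattacharyya_coefficient(candidate_hist, reference_hist)
    return float(np.exp(-d2 / (2.0 * sigma ** 2)))
```

In a multi-cue particle filter of the kind the abstract describes, a per-particle weight would combine such cue likelihoods, e.g. multiplicatively (`w = shape_l * texture_l * depth_l`) under a conditional-independence assumption.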

Cite

Text

Giebel et al. "A Bayesian Framework for Multi-Cue 3D Object Tracking." European Conference on Computer Vision, 2004. doi:10.1007/978-3-540-24673-2_20

Markdown

[Giebel et al. "A Bayesian Framework for Multi-Cue 3D Object Tracking." European Conference on Computer Vision, 2004.](https://mlanthology.org/eccv/2004/giebel2004eccv-bayesian/) doi:10.1007/978-3-540-24673-2_20

BibTeX

@inproceedings{giebel2004eccv-bayesian,
  title     = {{A Bayesian Framework for Multi-Cue 3D Object Tracking}},
  author    = {Giebel, Jan and Gavrila, Dariu and Schnörr, Christoph},
  booktitle = {European Conference on Computer Vision},
  year      = {2004},
  pages     = {241--252},
  doi       = {10.1007/978-3-540-24673-2_20},
  url       = {https://mlanthology.org/eccv/2004/giebel2004eccv-bayesian/}
}