Structure Inference for Bayesian Multisensory Perception and Tracking

Abstract

We investigate a solution to the problem of multisensory perception and tracking by formulating it in the framework of Bayesian model selection. Humans robustly associate multisensory data as appropriate, but previous theoretical work has focused largely on purely integrative cases, leaving segregation unaccounted for and unexploited by machine perception systems. We illustrate a unifying Bayesian solution to multisensory perception and tracking that accounts for both integration and segregation by explicit probabilistic reasoning about data association in a temporal context. Unsupervised learning of such a model with EM is illustrated for a real-world audio-visual application.
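The core idea of selecting between integration (one common source) and segregation (independent sources) can be illustrated with a minimal Bayesian model-comparison sketch. This is not the paper's full temporal model; it is a hedged, static example assuming zero-mean Gaussian priors and Gaussian cue noise, with all variable names (`var_a`, `var_v`, `var_prior`, `p_common`) invented here for illustration. The posterior probability of a common cause comes from comparing the marginal likelihoods of the two structures.

```python
import math


def bivariate_gauss(xa, xv, va, vv, cov):
    """Zero-mean bivariate normal density at (xa, xv)
    with covariance matrix [[va, cov], [cov, vv]]."""
    det = va * vv - cov ** 2
    expo = -(vv * xa ** 2 - 2.0 * cov * xa * xv + va * xv ** 2) / (2.0 * det)
    return math.exp(expo) / (2.0 * math.pi * math.sqrt(det))


def p_common(xa, xv, var_a, var_v, var_prior, prior=0.5):
    """Posterior probability that audio cue xa and visual cue xv
    share a single latent source (illustrative causal-inference model).

    Common cause: source s ~ N(0, var_prior) drives both cues, so
    marginalising s gives a correlated bivariate Gaussian with
    off-diagonal covariance var_prior. Separate causes: the same
    marginal variances but zero covariance.
    """
    # Evidence under the integrative (common-cause) structure.
    ev_common = bivariate_gauss(
        xa, xv, var_a + var_prior, var_v + var_prior, var_prior
    )
    # Evidence under the segregative (independent-causes) structure.
    ev_separate = bivariate_gauss(
        xa, xv, var_a + var_prior, var_v + var_prior, 0.0
    )
    num = prior * ev_common
    return num / (num + (1.0 - prior) * ev_separate)


if __name__ == "__main__":
    # Nearby cues favour integration; discrepant cues favour segregation.
    print(p_common(0.5, 0.6, 1.0, 1.0, 10.0))   # > 0.5 (integrate)
    print(p_common(3.0, -3.0, 1.0, 1.0, 10.0))  # < 0.5 (segregate)
```

Under these assumptions, the model automatically integrates cues that are close relative to their noise and segregates cues that are far apart, which is the qualitative behaviour the abstract describes.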

Cite

Text

Hospedales et al. "Structure Inference for Bayesian Multisensory Perception and Tracking." International Joint Conference on Artificial Intelligence, 2007.

Markdown

[Hospedales et al. "Structure Inference for Bayesian Multisensory Perception and Tracking." International Joint Conference on Artificial Intelligence, 2007.](https://mlanthology.org/ijcai/2007/hospedales2007ijcai-structure/)

BibTeX

@inproceedings{hospedales2007ijcai-structure,
  title     = {{Structure Inference for Bayesian Multisensory Perception and Tracking}},
  author    = {Hospedales, Timothy M. and Cartwright, Joel J. and Vijayakumar, Sethu},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2007},
  pages     = {2122--2128},
  url       = {https://mlanthology.org/ijcai/2007/hospedales2007ijcai-structure/}
}