The Infinite Factorial Hidden Markov Model

Abstract

We introduce a new probability distribution over a potentially infinite number of binary Markov chains which we call the Markov Indian buffet process. This process extends the Indian buffet process (IBP) to allow temporal dependencies in the hidden variables. We use this stochastic process to build a nonparametric extension of the factorial hidden Markov model. After working out an inference scheme which combines slice sampling and dynamic programming, we demonstrate how the infinite factorial hidden Markov model can be used for blind source separation.
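The Markov Indian buffet process is easiest to picture through the finite construction that the paper sends to the infinite limit: M binary Markov chains where chain m switches on with probability a_m ~ Beta(alpha/M, 1), stays on with probability b_m ~ Beta(gamma, delta), and starts in the off state. The sketch below is a minimal illustration of sampling from that finite-M prior; the function name, parameter defaults, and sampler code are our own illustrative assumptions, not the authors' implementation or inference scheme.

# Minimal sketch (not the authors' code) of the finite-M chain prior that
# the paper takes to the infinite limit to obtain the Markov IBP.
import numpy as np

def sample_finite_mibp(T, M, alpha=2.0, gamma=1.0, delta=1.0, seed=None):
    """Draw a T x M binary matrix of chain states from the finite-M prior."""
    rng = np.random.default_rng(seed)
    a = rng.beta(alpha / M, 1.0, size=M)   # P(off -> on) for each chain
    b = rng.beta(gamma, delta, size=M)     # P(on -> on) for each chain
    s = np.zeros((T, M), dtype=int)
    prev = np.zeros(M, dtype=int)          # dummy state s_0 = 0 for every chain
    for t in range(T):
        p_on = np.where(prev == 1, b, a)   # transition probability into "on"
        s[t] = (rng.random(M) < p_on).astype(int)
        prev = s[t]
    return s

if __name__ == "__main__":
    # With a large truncation M, only a handful of chains ever turn on,
    # mirroring the sparsity that the Markov IBP exhibits in the limit.
    S = sample_finite_mibp(T=50, M=500, seed=0)
    print("chains ever used:", int((S.sum(axis=0) > 0).sum()))

In the infinite factorial HMM these binary chains act as hidden on/off indicators whose emissions combine to form the observations, which is what makes the model applicable to blind source separation.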

Cite

Text

Van Gael et al. "The Infinite Factorial Hidden Markov Model." Neural Information Processing Systems, 2008.

Markdown

[Van Gael et al. "The Infinite Factorial Hidden Markov Model." Neural Information Processing Systems, 2008.](https://mlanthology.org/neurips/2008/gael2008neurips-infinite/)

BibTeX

@inproceedings{gael2008neurips-infinite,
  title     = {{The Infinite Factorial Hidden Markov Model}},
  author    = {Van Gael, Jurgen and Teh, Yee W. and Ghahramani, Zoubin},
  booktitle = {Neural Information Processing Systems},
  year      = {2008},
  pages     = {1697--1704},
  url       = {https://mlanthology.org/neurips/2008/gael2008neurips-infinite/}
}