Infinite Hierarchical Hidden Markov Models

Abstract

In this paper we present the Infinite Hierarchical Hidden Markov Model (IHHMM), a nonparametric generalization of Hierarchical Hidden Markov Models (HHMMs). HHMMs have been used for modeling sequential data in applications such as speech recognition, detecting topic transitions in video, and extracting information from text. The IHHMM provides more flexible modeling of sequential data by allowing a potentially unbounded number of levels in the hierarchy, instead of requiring a fixed hierarchy depth to be specified in advance. Inference and learning are performed efficiently using Gibbs sampling and a modified forward-backtrack algorithm. We conclude with encouraging demonstrations of the IHHMM in practice.
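
To make the hierarchical structure concrete, the sketch below gives a minimal generative model of a fixed-depth hierarchical HMM in Python. The parameterization (per-level multinomial transitions, a single end-of-segment probability p_end, one-dimensional Gaussian emissions) is a hypothetical illustration rather than the paper's exact model; the IHHMM removes the fixed n_levels bound nonparametrically and infers inference quantities with Gibbs sampling rather than direct sampling as shown here.

import numpy as np

rng = np.random.default_rng(0)

def sample_hhmm(T, n_levels=3, n_states=4, p_end=0.3):
    """Generative sketch of a fixed-depth hierarchical HMM.

    At each time step the bottom level transitions; with probability
    p_end it also ends its segment, causing its parent to transition,
    and so on up the hierarchy.  (Hypothetical parameterization for
    illustration only; the IHHMM places no bound on the number of
    levels.)
    """
    # One random multinomial transition matrix per level.
    trans = [rng.dirichlet(np.ones(n_states), size=n_states)
             for _ in range(n_levels)]
    # Per-state emission means for a 1-d Gaussian observation model.
    means = rng.normal(0.0, 3.0, size=n_states)

    states = rng.integers(n_states, size=n_levels)  # one state per level
    obs = np.empty(T)
    for t in range(T):
        # Level 0 (bottom) always transitions; each higher level
        # transitions only if the level beneath it ended its segment.
        level = 0
        while level < n_levels:
            states[level] = rng.choice(n_states, p=trans[level][states[level]])
            if rng.random() >= p_end:   # segment continues: stop propagating
                break
            level += 1                  # segment ended: parent transitions too
        # Observation depends on the bottom-level state.
        obs[t] = rng.normal(means[states[0]], 1.0)
    return obs

print(sample_hhmm(10))

The key design point this sketch illustrates is that higher levels change state only when a lower-level segment ends, so they evolve on progressively coarser time scales; the IHHMM keeps this structure while letting the number of such levels be learned from data.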

Cite

Text

Heller et al. "Infinite Hierarchical Hidden Markov Models." Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, 2009.

Markdown

[Heller et al. "Infinite Hierarchical Hidden Markov Models." Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, 2009.](https://mlanthology.org/aistats/2009/heller2009aistats-infinite/)

BibTeX

@inproceedings{heller2009aistats-infinite,
  title     = {{Infinite Hierarchical Hidden Markov Models}},
  author    = {Heller, Katherine and Teh, Yee Whye and Gorur, Dilan},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  year      = {2009},
  pages     = {224--231},
  volume    = {5},
  url       = {https://mlanthology.org/aistats/2009/heller2009aistats-infinite/}
}