Stochastic Variational Inference for the HDP-HMM

Abstract

We derive a variational inference algorithm for the HDP-HMM based on the two-level stick-breaking construction. This construction has previously been applied to the hierarchical Dirichlet process (HDP) for mixed membership models, allowing for efficient handling of the coupled weight parameters. However, the same algorithm is not directly applicable to the HDP-based infinite hidden Markov model (HDP-HMM) because of the extra sequential dependencies in the Markov chain. In this paper we provide a solution to this problem by deriving a variational inference algorithm for the HDP-HMM, as well as its stochastic extension, in which all parameter updates are in closed form. We apply our algorithm to sequential text analysis and audio signal analysis, comparing our results with the beam-sampled iHMM, the parametric HMM, and other variational inference approximations.
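The two-level construction in the abstract builds on the basic stick-breaking (GEM) representation of Dirichlet process weights. As background only (this is not the paper's algorithm), the following minimal Python sketch draws a truncated set of stick-breaking weights; the function name and truncation level `T` are illustrative choices.

```python
import numpy as np

def stick_breaking(alpha, T, rng):
    """Truncated stick-breaking: beta_k = v_k * prod_{l<k} (1 - v_l),
    with v_k ~ Beta(1, alpha). Mass beyond truncation T is dropped."""
    v = rng.beta(1.0, alpha, size=T)
    # Remaining stick length before each break: 1, (1-v_1), (1-v_1)(1-v_2), ...
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

rng = np.random.default_rng(0)
beta = stick_breaking(alpha=5.0, T=50, rng=rng)
# beta holds nonnegative weights summing to less than 1
```

In the HDP-HMM, a top-level weight vector of this form is shared across rows of the transition matrix, which is what couples the weight parameters the abstract refers to.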

Cite

Text

Zhang et al. "Stochastic Variational Inference for the HDP-HMM." International Conference on Artificial Intelligence and Statistics, 2016.

Markdown

[Zhang et al. "Stochastic Variational Inference for the HDP-HMM." International Conference on Artificial Intelligence and Statistics, 2016.](https://mlanthology.org/aistats/2016/zhang2016aistats-stochastic/)

BibTeX

@inproceedings{zhang2016aistats-stochastic,
  title     = {{Stochastic Variational Inference for the HDP-HMM}},
  author    = {Zhang, Aonan and Gultekin, San and Paisley, John W.},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2016},
  pages     = {800--808},
  url       = {https://mlanthology.org/aistats/2016/zhang2016aistats-stochastic/}
}