Particle Gibbs for Infinite Hidden Markov Models
Abstract
Infinite Hidden Markov Models (iHMMs) are an attractive, nonparametric generalization of the classical Hidden Markov Model which can automatically infer the number of hidden states in the system. However, due to the infinite-dimensional nature of the transition dynamics, performing inference in the iHMM is difficult. In this paper, we present an infinite-state Particle Gibbs (PG) algorithm to resample state trajectories for the iHMM. The proposed algorithm uses an efficient proposal optimized for iHMMs, and leverages ancestor sampling to improve the mixing of the standard PG algorithm. Our algorithm demonstrates significant convergence improvements on synthetic and real-world data sets.
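The core of the method is a conditional SMC (Particle Gibbs) sweep with ancestor sampling: one particle is clamped to the reference trajectory from the previous iteration, and at each step its ancestor is resampled by reweighting the current particles by the probability of transitioning into the reference state. The sketch below illustrates this kernel on a finite-state HMM with a bootstrap proposal; the function name `pgas_sweep` and the finite truncation are illustrative assumptions, not the paper's optimized iHMM proposal.

```python
import numpy as np

def pgas_sweep(y, ref_traj, trans, emit, init, n_particles=10, rng=None):
    """One Particle Gibbs sweep with ancestor sampling (finite-HMM sketch).

    y        : observations (length-T int array)
    ref_traj : reference state trajectory from the previous iteration (length T)
    trans    : K x K transition matrix, emit : K x V emission matrix
    init     : length-K initial state distribution
    Returns a new state trajectory drawn by the conditional SMC kernel.
    """
    rng = rng or np.random.default_rng()
    T, N, K = len(y), n_particles, trans.shape[0]
    particles = np.zeros((T, N), dtype=int)
    ancestors = np.zeros((T, N), dtype=int)

    # t = 0: sample free particles from the prior; clamp particle N-1 to the reference
    particles[0] = rng.choice(K, size=N, p=init)
    particles[0, N - 1] = ref_traj[0]
    logw = np.log(emit[particles[0], y[0]] + 1e-300)

    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling for the N-1 free particles
        ancestors[t, :N - 1] = rng.choice(N, size=N - 1, p=w)
        for n in range(N - 1):
            prev = particles[t - 1, ancestors[t, n]]
            particles[t, n] = rng.choice(K, p=trans[prev])
        # Ancestor sampling: resample the reference particle's ancestor,
        # reweighting by the probability of reaching the reference state
        particles[t, N - 1] = ref_traj[t]
        as_logw = logw + np.log(trans[particles[t - 1], ref_traj[t]] + 1e-300)
        as_w = np.exp(as_logw - as_logw.max())
        ancestors[t, N - 1] = rng.choice(N, p=as_w / as_w.sum())
        logw = np.log(emit[particles[t], y[t]] + 1e-300)

    # Draw a final particle and trace its ancestry back to t = 0
    w = np.exp(logw - logw.max())
    idx = rng.choice(N, p=w / w.sum())
    traj = np.zeros(T, dtype=int)
    traj[-1] = particles[-1, idx]
    for t in range(T - 1, 0, -1):
        idx = ancestors[t, idx]
        traj[t - 1] = particles[t - 1, idx]
    return traj
```

In the infinite-state setting, the transition matrix is instead instantiated lazily via the stick-breaking representation of the HDP, so the kernel above should be read as the finite-truncation skeleton of the procedure.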
Cite
Text
Tripuraneni et al. "Particle Gibbs for Infinite Hidden Markov Models." Neural Information Processing Systems, 2015.
Markdown
[Tripuraneni et al. "Particle Gibbs for Infinite Hidden Markov Models." Neural Information Processing Systems, 2015.](https://mlanthology.org/neurips/2015/tripuraneni2015neurips-particle/)
BibTeX
@inproceedings{tripuraneni2015neurips-particle,
title = {{Particle Gibbs for Infinite Hidden Markov Models}},
author = {Tripuraneni, Nilesh and Gu, Shixiang and Ge, Hong and Ghahramani, Zoubin},
booktitle = {Neural Information Processing Systems},
year = {2015},
pages = {2395-2403},
url = {https://mlanthology.org/neurips/2015/tripuraneni2015neurips-particle/}
}