Collapsed Variational Inference for HDP

Abstract

A wide variety of Dirichlet-multinomial ‘topic’ models have found interesting applications in recent years. While Gibbs sampling remains an important method of inference in such models, variational techniques have certain advantages such as easy assessment of convergence, easy optimization without the need to maintain detailed balance, a bound on the marginal likelihood, and side-stepping of issues with topic-identifiability. The most accurate variational technique thus far, namely collapsed variational latent Dirichlet allocation, did not deal with model selection nor did it include inference for hyperparameters. We address both issues by generalizing the technique, obtaining the first variational algorithm to deal with the hierarchical Dirichlet process and to deal with hyperparameters of Dirichlet variables. Experiments show a significant improvement in accuracy.
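The paper's full algorithm for the HDP is considerably more involved, but the core collapsed variational idea can be illustrated on plain LDA. Below is a minimal sketch of the zeroth-order simplification of the collapsed variational update (often called CVB0), assuming symmetric Dirichlet priors; the function name and data layout are hypothetical, and this is an illustration of the general technique rather than the paper's implementation.

```python
import numpy as np

def cvb0_sweep(gamma, docs, alpha, beta, W):
    """One sweep of zeroth-order collapsed variational (CVB0) updates for LDA.

    gamma : list of (N_j, K) arrays, variational responsibilities q(z_ji = k).
    docs  : list of length-N_j integer arrays of word ids in {0, ..., W-1}.
    alpha, beta : symmetric Dirichlet hyperparameters (doc-topic, topic-word).
    """
    K = gamma[0].shape[1]
    # Expected counts under q, with the Dirichlet priors folded in.
    n_wk = np.full((W, K), beta)            # expected topic-word counts + beta
    n_k = np.full(K, W * beta)              # expected topic totals + W * beta
    n_jk = [g.sum(axis=0) for g in gamma]   # expected per-document topic counts
    for words, g in zip(docs, gamma):
        for w, g_i in zip(words, g):
            n_wk[w] += g_i
            n_k += g_i
    for j, words in enumerate(docs):
        for i, w in enumerate(words):
            g_i = gamma[j][i]
            # Exclude token (j, i)'s own contribution ("minus-ij" counts).
            n_wk[w] -= g_i; n_k -= g_i; n_jk[j] -= g_i
            # CVB0: responsibility proportional to the collapsed Gibbs
            # conditional evaluated at the expected counts.
            new_g = (n_jk[j] + alpha) * n_wk[w] / n_k
            new_g /= new_g.sum()
            gamma[j][i] = new_g
            n_wk[w] += new_g; n_k += new_g; n_jk[j] += new_g
    return gamma
```

Maintaining expected counts and subtracting each token's own responsibility mirrors the leave-one-out structure of collapsed Gibbs sampling, but the updates are deterministic, so convergence can be monitored directly, in line with the advantages of variational inference noted in the abstract.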

Cite

Text

Teh et al. "Collapsed Variational Inference for HDP." Neural Information Processing Systems, 2007.

Markdown

[Teh et al. "Collapsed Variational Inference for HDP." Neural Information Processing Systems, 2007.](https://mlanthology.org/neurips/2007/teh2007neurips-collapsed/)

BibTeX

@inproceedings{teh2007neurips-collapsed,
  title     = {{Collapsed Variational Inference for HDP}},
  author    = {Teh, Yee W. and Kurihara, Kenichi and Welling, Max},
  booktitle = {Neural Information Processing Systems},
  year      = {2007},
  pages     = {1481--1488},
  url       = {https://mlanthology.org/neurips/2007/teh2007neurips-collapsed/}
}