Variational Methods for the Dirichlet Process

Abstract

Variational inference methods, including mean field methods and loopy belief propagation, have been widely used for approximate probabilistic inference in graphical models. While often less accurate than MCMC, variational methods provide a fast deterministic approximation to marginal and conditional probabilities. Such approximations can be particularly useful in high-dimensional problems where sampling methods are too slow to be effective. A limitation of current methods, however, is that they are restricted to parametric probabilistic models. MCMC does not have such a limitation; indeed, MCMC samplers have been developed for the Dirichlet process (DP), a nonparametric distribution on distributions (Ferguson, 1973) that is the cornerstone of Bayesian nonparametric statistics (Escobar and West, 1995; Neal, 2000). In this paper, we develop a mean-field variational approach to approximate inference for the Dirichlet process, where the approximate posterior is based on the truncated stick-breaking construction (Ishwaran and James, 2001). We compare our approach to DP samplers for Gaussian DP mixture models.
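The truncated stick-breaking construction mentioned in the abstract can be sketched in a few lines. This is an illustrative snippet, not code from the paper: it draws mixture weights by breaking a unit-length stick at Beta(1, α)-distributed proportions, truncating at level T by setting the final break point to 1 so the remaining stick mass is absorbed. The function name and parameters are our own choices for illustration.

```python
import numpy as np

def truncated_stick_breaking(alpha, T, rng):
    """Draw mixture weights pi from a stick-breaking construction
    truncated at level T.

    V_t ~ Beta(1, alpha) for t < T, V_T = 1, and
    pi_t = V_t * prod_{s < t} (1 - V_s).
    """
    v = rng.beta(1.0, alpha, size=T)
    v[-1] = 1.0  # truncation: the last break takes all remaining stick
    # remaining[t] = prod_{s < t} (1 - v[s]); remaining[0] = 1
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

rng = np.random.default_rng(0)
pi = truncated_stick_breaking(alpha=1.0, T=20, rng=rng)
# Because v[-1] = 1, the weights sum to exactly 1.
print(pi.sum())
```

Smaller values of α concentrate mass on the first few components; larger values spread it out, which is why the truncation level T must be chosen large enough relative to α for the approximation to be faithful.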

Cite

Text

Blei and Jordan. "Variational Methods for the Dirichlet Process." International Conference on Machine Learning, 2004. doi:10.1145/1015330.1015439

Markdown

[Blei and Jordan. "Variational Methods for the Dirichlet Process." International Conference on Machine Learning, 2004.](https://mlanthology.org/icml/2004/blei2004icml-variational/) doi:10.1145/1015330.1015439

BibTeX

@inproceedings{blei2004icml-variational,
  title     = {{Variational Methods for the Dirichlet Process}},
  author    = {Blei, David M. and Jordan, Michael I.},
  booktitle = {International Conference on Machine Learning},
  year      = {2004},
  doi       = {10.1145/1015330.1015439},
  url       = {https://mlanthology.org/icml/2004/blei2004icml-variational/}
}