Evaluating Probabilities Under High-Dimensional Latent Variable Models

Abstract

We present a simple new Monte Carlo algorithm for evaluating probabilities of observations in complex latent variable models, such as Deep Belief Networks. While the method is based on Markov chains, estimates based on short runs are formally unbiased. In expectation, the log probability of a test set will be underestimated, and this could form the basis of a probabilistic bound. The method is much cheaper than gold-standard annealing-based methods and only slightly more expensive than the cheapest Monte Carlo methods. We give examples of the new method substantially improving simple variational bounds at modest extra cost.
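The abstract's key property — an unbiased estimator of p(v) whose logarithm nevertheless underestimates log p(v) in expectation — follows from Jensen's inequality, since log is concave: E[log p̂] ≤ log E[p̂] = log p(v). The sketch below is not the paper's algorithm; it is a generic importance-sampling illustration on a hypothetical two-state latent variable model, showing an unbiased estimate of p(v) and the resulting stochastic lower bound on log p(v).

```python
import math
import random

# Hypothetical toy latent variable model (for illustration only):
# latent h in {0, 1}, observed v in {0, 1}.
p_h = [0.3, 0.7]                  # prior p(h)
p_v_given_h = [[0.9, 0.1],        # p(v | h=0)
               [0.2, 0.8]]        # p(v | h=1)

def exact_p(v):
    """Marginal p(v), summing out the latent variable exactly."""
    return sum(p_h[h] * p_v_given_h[h][v] for h in range(2))

def is_estimate(v, q=(0.5, 0.5), n_samples=10, rng=random):
    """Unbiased importance-sampling estimate of p(v):
       p_hat = (1/S) * sum_s p(v, h_s) / q(h_s),  with h_s ~ q."""
    total = 0.0
    for _ in range(n_samples):
        h = 0 if rng.random() < q[0] else 1
        total += p_h[h] * p_v_given_h[h][v] / q[h]
    return total / n_samples

rng = random.Random(0)
v = 1
estimates = [is_estimate(v, rng=rng) for _ in range(20000)]
mean_p = sum(estimates) / len(estimates)
mean_log_p = sum(math.log(e) for e in estimates) / len(estimates)

# mean_p concentrates near exact_p(v) (unbiasedness), while
# mean_log_p sits below log(exact_p(v)) (Jensen's inequality).
print(exact_p(v), mean_p, mean_log_p, math.log(exact_p(v)))
```

Averaging such log estimates over a test set therefore gives a value that is too low in expectation — the sense in which the abstract says the log probability "will be underestimated" and could underpin a probabilistic bound.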

Cite

Text

Murray and Salakhutdinov. "Evaluating Probabilities Under High-Dimensional Latent Variable Models." Neural Information Processing Systems, 2008.

Markdown

[Murray and Salakhutdinov. "Evaluating Probabilities Under High-Dimensional Latent Variable Models." Neural Information Processing Systems, 2008.](https://mlanthology.org/neurips/2008/murray2008neurips-evaluating/)

BibTeX

@inproceedings{murray2008neurips-evaluating,
  title     = {{Evaluating Probabilities Under High-Dimensional Latent Variable Models}},
  author    = {Murray, Iain and Salakhutdinov, Ruslan},
  booktitle = {Neural Information Processing Systems},
  year      = {2008},
  pages     = {1137--1144},
  url       = {https://mlanthology.org/neurips/2008/murray2008neurips-evaluating/}
}