Approximating Posterior Distributions in Belief Networks Using Mixtures
Abstract
Exact inference in densely connected Bayesian networks is computationally intractable, and so there is considerable interest in developing effective approximation schemes. One approach which has been adopted is to bound the log likelihood using a mean-field approximating distribution. While this leads to a tractable algorithm, the mean field distribution is assumed to be factorial and hence unimodal. In this paper we demonstrate the feasibility of using a richer class of approximating distributions based on mixtures of mean field distributions. We derive an efficient algorithm for updating the mixture parameters and apply it to the problem of learning in sigmoid belief networks. Our results demonstrate a systematic improvement over simple mean field theory as the number of mixture components is increased.
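For context, the bound referred to in the abstract is the standard variational lower bound on the log likelihood; the notation below is ours, not taken from the paper. For hidden variables H, visible variables V, and any approximating distribution Q,

\ln p(V) \;=\; \ln \sum_{H} p(H, V) \;\ge\; \sum_{H} Q(H) \ln \frac{p(H, V)}{Q(H)} \;\equiv\; \mathcal{L}(Q),

with equality when Q(H) = p(H \mid V). Mean field theory restricts Q to a factorial form, Q(H) = \prod_i Q_i(H_i); the mixture approach instead takes Q(H) = \sum_{m=1}^{M} \alpha_m Q^{(m)}(H), where each component Q^{(m)} is factorial. A short calculation shows that the mixture bound decomposes as

\mathcal{L}(Q) \;=\; \sum_{m=1}^{M} \alpha_m \, \mathcal{L}(Q^{(m)}) \;+\; I(m; H),

where I(m; H) \ge 0 is the mutual information between the component label and the hidden variables, so the mixture bound is never worse than the alpha-weighted average of the component bounds.

The toy script below illustrates this effect on a sigmoid belief net with two hidden and two visible units, small enough to evaluate everything by exact enumeration. It is a minimal sketch under our own assumptions (random weights, direct numerical optimization of the bound over mixture parameters); it is not the paper's update algorithm, which the authors derive in the full text.

import numpy as np
from scipy.optimize import minimize
from scipy.special import expit as sigmoid  # logistic function

rng = np.random.default_rng(0)

# Toy sigmoid belief net: 2 hidden units -> 2 visible units, random weights.
b = rng.normal(size=2)        # hidden-unit biases
W = rng.normal(size=(2, 2))   # hidden-to-visible weights
c = rng.normal(size=2)        # visible-unit biases
v = np.array([1.0, 0.0])      # observed visible vector

# All 4 hidden configurations, for exact enumeration.
H_all = np.array([[h1, h2] for h1 in (0, 1) for h2 in (0, 1)], dtype=float)

def log_joint(h):
    """ln p(h, v) for one hidden configuration h."""
    lp_h = np.sum(h * np.log(sigmoid(b)) + (1 - h) * np.log(sigmoid(-b)))
    a = W @ h + c
    lp_v = np.sum(v * np.log(sigmoid(a)) + (1 - v) * np.log(sigmoid(-a)))
    return lp_h + lp_v

lj = np.array([log_joint(h) for h in H_all])
log_pv = np.log(np.exp(lj).sum())  # exact ln p(v) by enumeration

def neg_bound(params, M):
    """Negative lower bound -L(Q) for an M-component mixture of factorial
    Bernoulli distributions, computed exactly over the 4 hidden states."""
    alpha = np.exp(params[:M]); alpha /= alpha.sum()   # mixing weights (softmax)
    q = sigmoid(params[M:].reshape(M, 2))              # component Bernoulli means
    Qh = np.array([(alpha * np.prod(q**h * (1 - q)**(1 - h), axis=1)).sum()
                   for h in H_all])                    # Q(h) for each configuration
    return -(Qh * (lj - np.log(Qh))).sum()

for M in (1, 2):
    # A few random restarts; M mixing logits plus 2M component logits per run.
    results = [minimize(neg_bound, rng.normal(size=3 * M), args=(M,)) for _ in range(5)]
    best = min(results, key=lambda r: r.fun)
    print(f"M={M}: bound = {-best.fun:.4f}, exact ln p(v) = {log_pv:.4f}")

With more mixture components the optimized bound moves closer to the exact ln p(v), mirroring the systematic improvement the abstract reports; M=1 recovers plain mean field.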
Cite
Text
Bishop et al. "Approximating Posterior Distributions in Belief Networks Using Mixtures." Neural Information Processing Systems, 1997.
Markdown
[Bishop et al. "Approximating Posterior Distributions in Belief Networks Using Mixtures." Neural Information Processing Systems, 1997.](https://mlanthology.org/neurips/1997/bishop1997neurips-approximating/)
BibTeX
@inproceedings{bishop1997neurips-approximating,
title = {{Approximating Posterior Distributions in Belief Networks Using Mixtures}},
author = {Bishop, Christopher M. and Lawrence, Neil D. and Jaakkola, Tommi and Jordan, Michael I.},
booktitle = {Neural Information Processing Systems},
year = {1997},
pages = {416--422},
url = {https://mlanthology.org/neurips/1997/bishop1997neurips-approximating/}
}