Propagation Algorithms for Variational Bayesian Learning
Abstract
Variational approximations are becoming a widespread tool for Bayesian learning of graphical models. We provide some theoretical results for the variational updates in a very general family of conjugate-exponential graphical models. We show how the belief propagation and the junction tree algorithms can be used in the inference step of variational Bayesian learning. Applying these results to the Bayesian analysis of linear-Gaussian state-space models we obtain a learning procedure that exploits the Kalman smoothing propagation, while integrating over all model parameters. We demonstrate how this can be used to infer the hidden state dimensionality of the state-space model in a variety of synthetic problems and one real high-dimensional data set.
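The inference step the abstract refers to reuses the Kalman (Rauch-Tung-Striebel) smoothing recursions inside the variational E-step. As a rough illustration only, here is a minimal sketch of that forward-filter/backward-smoother propagation for a scalar linear-Gaussian state-space model; the parameter values and data below are hypothetical and not taken from the paper:

```python
import numpy as np

def kalman_smoother(y, a, c, q, r, mu0, p0):
    """RTS smoother for the scalar linear-Gaussian state-space model
        x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)
        y_t = c * x_t + v_t,      v_t ~ N(0, r)
    Returns the smoothed means and variances of x_t given all of y."""
    T = len(y)
    mu_f = np.zeros(T); p_f = np.zeros(T)   # filtered moments
    mu_p = np.zeros(T); p_p = np.zeros(T)   # one-step predicted moments
    # forward (Kalman filter) pass
    for t in range(T):
        if t == 0:
            mu_p[t], p_p[t] = mu0, p0
        else:
            mu_p[t] = a * mu_f[t - 1]
            p_p[t] = a * a * p_f[t - 1] + q
        k = p_p[t] * c / (c * c * p_p[t] + r)        # Kalman gain
        mu_f[t] = mu_p[t] + k * (y[t] - c * mu_p[t])
        p_f[t] = (1.0 - k * c) * p_p[t]
    # backward (RTS smoothing) pass
    mu_s = mu_f.copy(); p_s = p_f.copy()
    for t in range(T - 2, -1, -1):
        j = p_f[t] * a / p_p[t + 1]                  # smoother gain
        mu_s[t] = mu_f[t] + j * (mu_s[t + 1] - mu_p[t + 1])
        p_s[t] = p_f[t] + j * j * (p_s[t + 1] - p_p[t + 1])
    return mu_s, p_s

# hypothetical synthetic data from the same model
rng = np.random.default_rng(0)
a, c, q, r = 0.9, 1.0, 0.1, 0.5
x, ys = 0.0, []
for _ in range(50):
    x = a * x + rng.normal(0.0, np.sqrt(q))
    ys.append(c * x + rng.normal(0.0, np.sqrt(r)))
mu_s, p_s = kalman_smoother(np.array(ys), a, c, q, r, 0.0, 1.0)
```

In the paper's variational Bayesian setting this propagation is run not with fixed parameters (a, c, q, r) but with the required expectations under their current variational posterior, and the resulting sufficient statistics feed the parameter updates.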
Cite
Text
Ghahramani and Beal. "Propagation Algorithms for Variational Bayesian Learning." Neural Information Processing Systems, 2000.
Markdown
[Ghahramani and Beal. "Propagation Algorithms for Variational Bayesian Learning." Neural Information Processing Systems, 2000.](https://mlanthology.org/neurips/2000/ghahramani2000neurips-propagation/)
BibTeX
@inproceedings{ghahramani2000neurips-propagation,
title = {{Propagation Algorithms for Variational Bayesian Learning}},
author = {Ghahramani, Zoubin and Beal, Matthew J.},
booktitle = {Neural Information Processing Systems},
year = {2000},
pages = {507--513},
url = {https://mlanthology.org/neurips/2000/ghahramani2000neurips-propagation/}
}