Generalized Belief Propagation for Approximate Inference in Hybrid Bayesian Networks
Abstract
We apply generalized belief propagation to approximate inference in hybrid Bayesian networks. In essence, in the algorithms developed for discrete networks we only have to change "strong marginalization" (exact) into "weak marginalization" (same moments) or, equivalently, the "sum" operation in the (generalized) sum-product algorithm into a "collapse" operation. We describe both a message-free single-loop algorithm based on fixed-point iteration and a more tedious double-loop algorithm guaranteed to converge to a minimum of the Kikuchi free energy. With the cluster variation method we can interpolate between the minimal Kikuchi approximation and the (strong) junction tree algorithm. Simulations on the emission network of [7], extended in [13], indicate that the Kikuchi approximation in practice often works really well, even in the difficult case of discrete children of continuous parents.
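The "collapse" operation mentioned above replaces exact marginalization with weak marginalization: a mixture of Gaussians is replaced by a single Gaussian with the same mean and covariance. As an illustration only (not code from the paper), the following sketch collapses a one-dimensional Gaussian mixture by moment matching; the function name and interface are my own.

```python
def collapse_mixture(weights, means, variances):
    """Weak marginalization: collapse a 1-D Gaussian mixture
    to a single Gaussian with the same first two moments.

    weights, means, variances: lists of floats, one entry per
    mixture component; weights are assumed to sum to one.
    """
    # First moment: E[x] = sum_i w_i * m_i
    mean = sum(w * m for w, m in zip(weights, means))
    # Second moment: E[x^2] = sum_i w_i * (v_i + m_i^2)
    second_moment = sum(w * (v + m * m)
                        for w, m, v in zip(weights, means, variances))
    # Matched variance: Var[x] = E[x^2] - E[x]^2
    return mean, second_moment - mean * mean


# Example: a symmetric two-component mixture collapses to a
# zero-mean Gaussian whose variance absorbs the mode spread.
m, v = collapse_mixture([0.5, 0.5], [-1.0, 1.0], [1.0, 1.0])
print(m, v)  # -> 0.0 2.0
```

The same idea extends to the multivariate and conditional-Gaussian cases used in hybrid networks, where means become vectors and variances become covariance matrices.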
Heskes and Zoeter. "Generalized Belief Propagation for Approximate Inference in Hybrid Bayesian Networks." Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics, 2003.
@inproceedings{heskes2003aistats-generalized,
title = {{Generalized Belief Propagation for Approximate Inference in Hybrid Bayesian Networks}},
author = {Heskes, Tom and Zoeter, Onno},
booktitle = {Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics},
year = {2003},
pages = {132-140},
volume = {R4},
url = {https://mlanthology.org/aistats/2003/heskes2003aistats-generalized/}
}