Loopy Belief Propagation: Convergence and Effects of Message Errors

Abstract

Belief propagation (BP) is an increasingly popular method of performing approximate inference on arbitrary graphical models. At times, even further approximations are required, whether from quantization of the messages or model parameters, from other simplified message or model representations, or from stochastic approximation methods. The introduction of such errors into the BP message computations has the potential to adversely affect the solution obtained. We analyze the effect of message approximation under two particular measures of error, and show bounds on the accumulation of errors in the system. This analysis leads to convergence conditions for traditional BP message passing, and to both strict bounds and estimates of the resulting error in systems of approximate BP message passing.
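
For readers unfamiliar with the algorithm the abstract analyzes, the sketch below implements standard loopy BP (sum-product) message passing on a small pairwise Markov random field containing a cycle. The graph, potentials, and variable names are illustrative choices, not taken from the paper; the per-iteration message change tracked in delta is only a simple surrogate for the contraction-style convergence analysis the paper develops.

import numpy as np

# Minimal loopy BP (sum-product) on a pairwise MRF with binary variables.
# The 3-cycle graph and potentials below are illustrative only.
edges = [(0, 1), (1, 2), (2, 0)]
n_states = 2
pairwise = {e: np.array([[1.0, 0.5], [0.5, 1.0]]) for e in edges}  # psi_uv(x_u, x_v)
unary = {i: np.array([0.7, 0.3]) for i in range(3)}                # psi_i(x_i)

# Directed messages m[(u, v)](x_v), initialized uniform.
msgs = {(u, v): np.ones(n_states) / n_states
        for (a, b) in edges for (u, v) in [(a, b), (b, a)]}

def neighbors(i):
    return [v for (u, v) in msgs if u == i]

for it in range(100):
    new_msgs = {}
    for (u, v) in msgs:
        # Product of the unary potential and incoming messages, excluding v.
        prod = unary[u].copy()
        for w in neighbors(u):
            if w != v:
                prod *= msgs[(w, u)]
        # Orient the pairwise potential as psi[x_u, x_v] for direction (u, v).
        psi = pairwise[(u, v)] if (u, v) in pairwise else pairwise[(v, u)].T
        m = psi.T @ prod          # marginalize over x_u
        new_msgs[(u, v)] = m / m.sum()
    # Largest message change across the graph, used as a convergence check.
    delta = max(np.abs(new_msgs[k] - msgs[k]).max() for k in msgs)
    msgs = new_msgs
    if delta < 1e-8:
        break

# Approximate marginals (beliefs) at the fixed point.
for i in range(3):
    b = unary[i].copy()
    for w in neighbors(i):
        b *= msgs[(w, i)]
    print(f"node {i}: belief = {b / b.sum()} (after {it + 1} iterations)")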

Cite

Text

Ihler et al. "Loopy Belief Propagation: Convergence and Effects of Message Errors." Journal of Machine Learning Research, 2005.

Markdown

[Ihler et al. "Loopy Belief Propagation: Convergence and Effects of Message Errors." Journal of Machine Learning Research, 2005.](https://mlanthology.org/jmlr/2005/ihler2005jmlr-loopy/)

BibTeX

@article{ihler2005jmlr-loopy,
  title     = {{Loopy Belief Propagation: Convergence and Effects of Message Errors}},
  author    = {Ihler, Alexander T. and Fisher, III, John W. and Willsky, Alan S.},
  journal   = {Journal of Machine Learning Research},
  year      = {2005},
  pages     = {905--936},
  volume    = {6},
  url       = {https://mlanthology.org/jmlr/2005/ihler2005jmlr-loopy/}
}