Locally Conditioned Belief Propagation
Abstract
Conditioned Belief Propagation (CBP) is an algorithm for approximate inference in probabilistic graphical models. It works by conditioning on a subset of variables, and solving the remainder using loopy Belief Propagation. Unfortunately, CBP's runtime scales exponentially in the number of conditioned variables. Locally Conditioned Belief Propagation (LCBP) approximates the results of CBP by treating conditions locally, and in this way avoids the exponential blow-up. We formulate LCBP as a variational optimization problem and derive a set of update equations that can be used to solve it. We show empirically that LCBP delivers results that are close to those obtained from CBP, while the computational cost scales favorably with problem size.
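The CBP outer loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it uses a hypothetical toy chain model, and it sums out the free variables exactly as a stand-in for running loopy Belief Propagation on each conditioned sub-problem. The exponential cost shows up as the product over all assignments to the conditioned variables.

```python
import itertools

def chain_model():
    """Toy pairwise model over 3 binary variables (an assumption for
    illustration): phi[(i, j)][xi][xj] is the pairwise potential."""
    phi = {
        (0, 1): [[2.0, 1.0], [1.0, 2.0]],
        (1, 2): [[1.0, 3.0], [3.0, 1.0]],
    }
    return 3, phi

def joint_weight(x, phi):
    """Unnormalized weight of a full assignment x."""
    w = 1.0
    for (i, j), table in phi.items():
        w *= table[x[i]][x[j]]
    return w

def conditioned_inference(n, phi, cond_vars):
    """CBP-style outer loop (sketch): enumerate every assignment to the
    conditioned variables (exponential in len(cond_vars)); solve each
    sub-problem (here exactly, standing in for loopy BP); combine the
    branch marginals weighted by each branch's partition function."""
    free = [v for v in range(n) if v not in cond_vars]
    marg = [[0.0, 0.0] for _ in range(n)]
    Z = 0.0
    for cvals in itertools.product([0, 1], repeat=len(cond_vars)):
        branch = [[0.0, 0.0] for _ in range(n)]
        branch_Z = 0.0
        for fvals in itertools.product([0, 1], repeat=len(free)):
            x = [0] * n
            for v, val in zip(cond_vars, cvals):
                x[v] = val
            for v, val in zip(free, fvals):
                x[v] = val
            w = joint_weight(x, phi)
            branch_Z += w
            for v in range(n):
                branch[v][x[v]] += w
        # Normalize the branch marginals, then mix branches weighted
        # by the branch partition-function estimate branch_Z.
        Z += branch_Z
        for v in range(n):
            for s in (0, 1):
                marg[v][s] += branch_Z * (branch[v][s] / branch_Z)
    return [[m / Z for m in row] for row in marg]

n, phi = chain_model()
marginals = conditioned_inference(n, phi, cond_vars=[1])
```

Because the sub-problems are solved exactly here, the mixture recovers the exact marginals; in CBP proper, each branch is approximated by loopy BP, and LCBP further avoids enumerating all conditioned assignments jointly by treating the conditions locally.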
Cite
Text
Geier et al. "Locally Conditioned Belief Propagation." Conference on Uncertainty in Artificial Intelligence, 2015.
Markdown
[Geier et al. "Locally Conditioned Belief Propagation." Conference on Uncertainty in Artificial Intelligence, 2015.](https://mlanthology.org/uai/2015/geier2015uai-locally/)
BibTeX
@inproceedings{geier2015uai-locally,
title = {{Locally Conditioned Belief Propagation}},
author = {Geier, Thomas and Richter, Felix and Biundo, Susanne},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2015},
pages = {296--305},
url = {https://mlanthology.org/uai/2015/geier2015uai-locally/}
}