The Unified Propagation and Scaling Algorithm
Abstract
In this paper we will show that a restricted class of constrained minimum divergence problems, named generalized inference problems, can be solved by approximating the KL divergence with a Bethe free energy. The algorithm we derive is closely related to both loopy belief propagation and iterative scaling. This unified propagation and scaling algorithm reduces to a convergent alternative to loopy belief propagation when no constraints are present. Experiments show the viability of our algorithm.
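As a point of reference for the iterative scaling component mentioned above, the following is a minimal sketch of classical iterative proportional fitting: rescaling a prior joint distribution until its marginals match given constraints, which yields the minimum-KL-divergence (I-projection) solution. The function name, variables, and numbers are illustrative, not taken from the paper.

```python
import numpy as np

def iterative_scaling(prior, row_marginal, col_marginal, n_iters=100):
    """Project `prior` onto the set of joints with the given marginals
    by alternately rescaling rows and columns (iterative scaling)."""
    q = prior.copy()
    for _ in range(n_iters):
        q *= (row_marginal / q.sum(axis=1))[:, None]  # enforce row marginal
        q *= (col_marginal / q.sum(axis=0))[None, :]  # enforce column marginal
    return q

# Hypothetical example: a joint over two binary variables.
prior = np.array([[0.4, 0.1],
                  [0.2, 0.3]])
q = iterative_scaling(prior,
                      row_marginal=np.array([0.6, 0.4]),
                      col_marginal=np.array([0.5, 0.5]))
```

After convergence, `q` has the requested marginals while staying as close as possible (in KL divergence) to the prior. The paper's unified algorithm interleaves such scaling updates with belief-propagation-style message passing on a Bethe free energy.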
Cite
Text
Teh and Welling. "The Unified Propagation and Scaling Algorithm." Neural Information Processing Systems, 2001.
Markdown
[Teh and Welling. "The Unified Propagation and Scaling Algorithm." Neural Information Processing Systems, 2001.](https://mlanthology.org/neurips/2001/teh2001neurips-unified/)
BibTeX
@inproceedings{teh2001neurips-unified,
title = {{The Unified Propagation and Scaling Algorithm}},
author = {Teh, Yee W. and Welling, Max},
booktitle = {Neural Information Processing Systems},
year = {2001},
pages = {953-960},
url = {https://mlanthology.org/neurips/2001/teh2001neurips-unified/}
}