Tree-Structured Approximations by Expectation Propagation
Abstract
Approximation structure plays an important role in inference on loopy graphs. As a tractable structure, tree approximations have been utilized in the variational method of Ghahramani & Jordan (1997) and the sequential projection method of Frey et al. (2000). However, belief propagation represents each factor of the graph with a product of single-node messages. In this paper, belief propagation is extended to represent factors with tree approximations, by way of the expectation propagation framework. That is, each factor sends a "message" to all pairs of nodes in a tree structure. The result is more accurate inferences and more frequent convergence than ordinary belief propagation, at a lower cost than variational trees or double-loop algorithms.
Cite
Text
Qi and Minka. "Tree-Structured Approximations by Expectation Propagation." Neural Information Processing Systems, 2003.
Markdown
[Qi and Minka. "Tree-Structured Approximations by Expectation Propagation." Neural Information Processing Systems, 2003.](https://mlanthology.org/neurips/2003/qi2003neurips-treestructured/)
BibTeX
@inproceedings{qi2003neurips-treestructured,
title = {{Tree-Structured Approximations by Expectation Propagation}},
author = {Qi, Yuan and Minka, Tom},
booktitle = {Neural Information Processing Systems},
year = {2003},
  pages = {193--200},
url = {https://mlanthology.org/neurips/2003/qi2003neurips-treestructured/}
}