Accumulator Networks: Suitors of Local Probability Propagation
Abstract
One way to approximate inference in richly-connected graphical models is to apply the sum-product algorithm (a.k.a. probability propagation algorithm), while ignoring the fact that the graph has cycles. The sum-product algorithm can be directly applied in Gaussian networks and in graphs for coding, but for many conditional probability functions - including the sigmoid function - direct application of the sum-product algorithm is not possible. We introduce "accumulator networks" that have low local complexity (but exponential global complexity) so the sum-product algorithm can be directly applied. In an accumulator network, the probability of a child given its parents is computed by accumulating the inputs from the parents in a Markov chain or more generally a tree. After giving expressions for inference and learning in accumulator networks, we give results on the "bars problem" and on the problem of extracting translated, overlapping faces from an image.
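The sketch below is a minimal, hedged illustration of the accumulation idea described in the abstract: the child's conditional probability is obtained by folding the parents' values in one at a time through a chain of accumulator variables, so each local factor involves only a few variables. The variable names, the binary accumulator tables, and the convention that the chain starts at zero are all illustrative assumptions, not the authors' exact parameterization.

```python
# Illustrative sketch (not the authors' code): P(child | parents) in an
# accumulator network, computed by marginalizing a chain of binary
# accumulator variables a_1, ..., a_n with a forward pass (O(n) work).
import numpy as np

def child_given_parents(z, p_acc, p_child):
    """
    z       : tuple of binary parent values (z_1, ..., z_n)  [hypothetical inputs]
    p_acc   : p_acc[k][a_prev, z_k] = P(a_k = 1 | a_{k-1} = a_prev, z_k)
    p_child : p_child[a_n] = P(x = 1 | a_n)
    Returns P(x = 1 | z_1, ..., z_n).
    """
    # Forward message: distribution over the current accumulator value.
    alpha = np.array([1.0, 0.0])          # assume a_0 is clamped to 0
    for k in range(len(z)):
        p1 = p_acc[k][:, z[k]]            # P(a_k = 1 | a_{k-1}, z_k), for a_{k-1} in {0, 1}
        trans = np.stack([1.0 - p1, p1])  # trans[a_k, a_{k-1}]
        alpha = trans @ alpha             # marginalize out a_{k-1}
    return float(alpha @ p_child)         # marginalize out a_n

# Tiny usage example with 3 parents and arbitrary made-up tables.
rng = np.random.default_rng(0)
p_acc = [rng.uniform(size=(2, 2)) for _ in range(3)]
p_child = rng.uniform(size=2)
print(child_given_parents((1, 0, 1), p_acc, p_child))
```

Because each factor in the chain couples only the previous accumulator, one parent, and the next accumulator, the local message computations stay small even when the child has many parents, which is what makes direct application of the sum-product algorithm feasible.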
Cite

Text
Frey and Kannan. "Accumulator Networks: Suitors of Local Probability Propagation." Neural Information Processing Systems, 2000.

Markdown
[Frey and Kannan. "Accumulator Networks: Suitors of Local Probability Propagation." Neural Information Processing Systems, 2000.](https://mlanthology.org/neurips/2000/frey2000neurips-accumulator/)

BibTeX
@inproceedings{frey2000neurips-accumulator,
title = {{Accumulator Networks: Suitors of Local Probability Propagation}},
author = {Frey, Brendan J. and Kannan, Anitha},
booktitle = {Neural Information Processing Systems},
year = {2000},
pages = {486--492},
url = {https://mlanthology.org/neurips/2000/frey2000neurips-accumulator/}
}