An Application of Tree-Structured Expectation Propagation for Channel Decoding

Abstract

We present an application of tree-structured approximations for inference in graphical models using the expectation propagation algorithm. These approximations are typically used over graphs with short-range cycles. We demonstrate that they also help in sparse graphs with long-range loops, such as those used in coding theory to approach channel capacity. For asymptotically large sparse graphs, the expectation propagation algorithm together with the tree structure yields a completely disconnected approximation to the graphical model, but for finite-length practical sparse graphs, the tree-structured approximation to the code graph provides accurate estimates of the marginal of each variable.
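
As a concrete illustration of the inference target, the minimal Python sketch below (not the authors' decoder; the parity-check matrix `H`, crossover probability `p`, and received word `y` are assumptions chosen for the example) computes the exact per-bit posterior marginals of a toy linear code over a binary symmetric channel by brute-force enumeration of codewords. For the long, sparse code graphs discussed in the paper this enumeration is infeasible, which is where approximations such as tree-structured expectation propagation come in.

```python
import itertools
import numpy as np

# Toy 3x6 parity-check matrix (illustrative assumption, not a code from the paper).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

p = 0.1                              # BSC crossover probability (assumed)
y = np.array([0, 1, 0, 0, 1, 1])     # received word (assumed)

n = H.shape[1]
marginals = np.zeros((n, 2))

# Brute-force enumeration of all codewords gives the exact bit-wise
# posteriors P(x_i | y) that approximate inference tries to estimate.
for bits in itertools.product([0, 1], repeat=n):
    x = np.array(bits)
    if np.any(H @ x % 2):            # skip words that violate a parity check
        continue
    flips = np.sum(x != y)
    likelihood = (p ** flips) * ((1 - p) ** (n - flips))
    for i in range(n):
        marginals[i, x[i]] += likelihood

marginals /= marginals.sum(axis=1, keepdims=True)
print(np.round(marginals, 3))        # row i: [P(x_i = 0 | y), P(x_i = 1 | y)]
```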

Cite

Text

Olmos et al. "An Application of Tree-Structured Expectation Propagation for Channel Decoding." Neural Information Processing Systems, 2011.

Markdown

[Olmos et al. "An Application of Tree-Structured Expectation Propagation for Channel Decoding." Neural Information Processing Systems, 2011.](https://mlanthology.org/neurips/2011/olmos2011neurips-application/)

BibTeX

@inproceedings{olmos2011neurips-application,
  title     = {{An Application of Tree-Structured Expectation Propagation for Channel Decoding}},
  author    = {Olmos, Pablo M. and Salamanca, Luis and Murillo-Fuentes, Juan José and Pérez-Cruz, Fernando},
  booktitle = {Neural Information Processing Systems},
  year      = {2011},
  pages     = {1854--1862},
  url       = {https://mlanthology.org/neurips/2011/olmos2011neurips-application/}
}