Linear Programming Relaxations and Belief Propagation -- an Empirical Study
Abstract
The problem of finding the most probable (MAP) configuration in graphical models comes up in a wide range of applications. In a general graphical model this problem is NP-hard, but various approximate algorithms have been developed. Linear programming (LP) relaxations are a standard method in computer science for approximating combinatorial problems and have been used for finding the most probable assignment in small graphical models. However, applying this powerful method to real-world problems is extremely challenging due to the large number of variables and constraints in the linear program. Tree-Reweighted Belief Propagation is a promising recent algorithm for solving LP relaxations, but little is known about its running time on large problems.
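For context, the LP relaxation referred to here is typically the standard pairwise relaxation over the local marginal polytope: the discrete MAP problem is replaced by an optimization over singleton and pairwise pseudomarginals. The sketch below uses our own notation (not taken from this page), with \theta denoting the log-potentials of the model:

\max_{\mu \ge 0}\;\; \sum_{i} \sum_{x_i} \theta_i(x_i)\,\mu_i(x_i) \;+\; \sum_{(i,j) \in E} \sum_{x_i, x_j} \theta_{ij}(x_i, x_j)\,\mu_{ij}(x_i, x_j)

\text{subject to}\;\; \sum_{x_j} \mu_{ij}(x_i, x_j) = \mu_i(x_i),\quad \sum_{x_i} \mu_{ij}(x_i, x_j) = \mu_j(x_j)\quad \text{for all } (i,j) \in E,
\qquad \sum_{x_i} \mu_i(x_i) = 1\quad \text{for all } i.

For a model with n variables, |E| edges, and k states per variable, this LP has on the order of nk + |E|k^2 variables and a comparable number of consistency constraints, which is why generic LP solvers become impractical on large real-world instances and why message-passing solvers such as tree-reweighted belief propagation are attractive.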
Cite
Text
Yanover et al. "Linear Programming Relaxations and Belief Propagation -- an Empirical Study." Journal of Machine Learning Research, 2006.Markdown
[Yanover et al. "Linear Programming Relaxations and Belief Propagation -- an Empirical Study." Journal of Machine Learning Research, 2006.](https://mlanthology.org/jmlr/2006/yanover2006jmlr-linear/)BibTeX
@article{yanover2006jmlr-linear,
title = {{Linear Programming Relaxations and Belief Propagation -- an Empirical Study}},
author = {Yanover, Chen and Meltzer, Talya and Weiss, Yair},
journal = {Journal of Machine Learning Research},
year = {2006},
pages = {1887--1907},
volume = {7},
url = {https://mlanthology.org/jmlr/2006/yanover2006jmlr-linear/}
}