Message-Passing for Graph-Structured Linear Programs: Proximal Projections, Convergence and Rounding Schemes

Abstract

Linear programming relaxations are one promising approach to solving the MAP estimation problem in Markov random fields; in particular, a body of past work has focused on the first-order tree-based LP relaxation for the MAP problem. Although a variety of algorithms with interesting connections to this LP have been proposed, to date none is guaranteed to solve the LP in general. In this paper, we develop a family of provably convergent LP solvers based on proximal minimization schemes using Bregman divergences that exploit the underlying graphical structure, and so scale well to large problems. All of our algorithms have a double-loop character, with the outer loop corresponding to the proximal sequence, and an inner loop of cyclic Bregman projections used to compute each proximal update. The inner loop updates are distributed and respect the graph structure, and thus can be cast as message-passing algorithms. We establish various convergence guarantees for our algorithms, illustrate their performance on medium to large-scale problems, and also present a tree-based rounding scheme with provable optimality guarantees.
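To make the double-loop structure concrete, here is a minimal sketch (not the paper's graph-structured algorithm) of proximal minimization with an entropic (KL) Bregman divergence on a toy LP over the probability simplex. In this simplified setting the proximal step has a closed-form multiplicative update, and the Bregman projection onto the normalization constraint reduces to rescaling; the step size `eta` and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

def entropic_proximal_lp(c, eta=0.5, iters=200):
    """Minimize c . x over the probability simplex using entropic
    proximal updates: a toy instance of proximal minimization with a
    KL Bregman divergence. Here the 'inner loop' of cyclic Bregman
    projections collapses to a single projection onto {sum(x) = 1},
    which for the KL divergence is just renormalization."""
    c = np.asarray(c, dtype=float)
    x = np.full(len(c), 1.0 / len(c))  # uniform initialization
    for _ in range(iters):
        # Proximal step for a linear objective under KL divergence:
        # x_i <- x_i * exp(-eta * c_i)  (closed form, multiplicative)
        x = x * np.exp(-eta * c)
        # Bregman (KL) projection onto the simplex: renormalize
        x = x / x.sum()
    return x

x = entropic_proximal_lp([0.3, 0.1, 0.7])
# the mass concentrates on the coordinate with the smallest cost
```

In the paper's setting the constraint set is the first-order (tree-based) local polytope rather than a single simplex, so the inner loop cycles Bregman projections over the node and edge marginalization constraints; because each projection touches only local variables, the updates decompose into message-passing form.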

Cite

Text

Ravikumar et al. "Message-Passing for Graph-Structured Linear Programs: Proximal Projections, Convergence and Rounding Schemes." International Conference on Machine Learning, 2008. doi:10.1145/1390156.1390257

Markdown

[Ravikumar et al. "Message-Passing for Graph-Structured Linear Programs: Proximal Projections, Convergence and Rounding Schemes." International Conference on Machine Learning, 2008.](https://mlanthology.org/icml/2008/ravikumar2008icml-message/) doi:10.1145/1390156.1390257

BibTeX

@inproceedings{ravikumar2008icml-message,
  title     = {{Message-Passing for Graph-Structured Linear Programs: Proximal Projections, Convergence and Rounding Schemes}},
  author    = {Ravikumar, Pradeep and Agarwal, Alekh and Wainwright, Martin J.},
  booktitle = {International Conference on Machine Learning},
  year      = {2008},
  pages     = {800--807},
  doi       = {10.1145/1390156.1390257},
  url       = {https://mlanthology.org/icml/2008/ravikumar2008icml-message/}
}