Convergence of Some Convex Message Passing Algorithms to a Fixed Point
Abstract
A popular approach to the MAP inference problem in graphical models is to minimize an upper bound, obtained from a dual linear programming or Lagrangian relaxation, by (block-)coordinate descent. This approach is also known as convex/convergent message passing; examples are max-sum diffusion and sequential tree-reweighted message passing (TRW-S). Convergence properties of these methods are currently not fully understood. They have been proved to converge to the set characterized by local consistency of active constraints, with unknown convergence rate; however, it was not known whether the iterates themselves converge at all (to any point). We prove a stronger result (conjectured before but never proved): the iterates converge to a fixed point of the method. Moreover, we show that the algorithm terminates within $\mathcal{O}(1/\varepsilon)$ iterations. We first prove this for a version of coordinate descent applied to a general piecewise-affine convex objective, and then show that several convex message passing methods are special cases of this method. Finally, we show that a slightly different version of coordinate descent can cycle.
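To make the abstract's setting concrete, below is a minimal sketch of max-sum diffusion, one of the convex message passing methods named above, viewed as block-coordinate descent on a convex, piecewise-affine upper bound for a small pairwise model. This is an illustrative reconstruction under standard reparametrization conventions, not code from the paper; all names (`theta_u`, `theta_p`, `phi`, `diffusion_sweep`, `upper_bound`) are choices made for this sketch. Each update averages a unary max-marginal with the matching pairwise max-marginal, which never increases the bound.

```python
# Illustrative sketch of max-sum diffusion (not the paper's code), assuming the
# usual reparametrization theta^phi:
#   theta_i^phi(x_i)       = theta_i(x_i) - sum_{j ~ i} phi[i,j](x_i)
#   theta_ij^phi(x_i, x_j) = theta_ij(x_i, x_j) + phi[i,j](x_i) + phi[j,i](x_j)
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3                                      # chain of n nodes, k labels each
edges = [(i, i + 1) for i in range(n - 1)]

theta_u = {i: rng.normal(size=k) for i in range(n)}        # unary potentials
theta_p = {e: rng.normal(size=(k, k)) for e in edges}      # pairwise potentials
phi = {(i, j): np.zeros(k) for i, j in edges}              # messages, both directions
phi.update({(j, i): np.zeros(k) for i, j in edges})

def unary(i):
    """Reparametrized unary potential theta_i^phi."""
    out = theta_u[i].copy()
    for (a, b) in phi:
        if a == i:
            out -= phi[(a, b)]
    return out

def pairwise(i, j):
    """Reparametrized pairwise potential theta_ij^phi, indexed [x_i, x_j]."""
    return theta_p[(i, j)] + phi[(i, j)][:, None] + phi[(j, i)][None, :]

def upper_bound():
    """Sum of max-marginals: a convex, piecewise-affine upper bound on the MAP value."""
    return (sum(unary(i).max() for i in range(n))
            + sum(pairwise(i, j).max() for i, j in edges))

def diffusion_sweep():
    """One pass of max-sum diffusion: for each directed edge, equalize the unary
    max-marginal with the pairwise max-marginal by averaging them."""
    for (i, j) in edges:
        for (a, b), axis in (((i, j), 1), ((j, i), 0)):
            pw = pairwise(i, j).max(axis=axis)   # max over the other node's labels
            phi[(a, b)] += (unary(a) - pw) / 2   # average the two max-marginals

for t in range(20):
    diffusion_sweep()
    print(f"sweep {t:2d}: upper bound = {upper_bound():.6f}")  # non-increasing
```

The printed upper bound decreases monotonically; the paper's result concerns whether such iterates converge to a fixed point, and at what rate.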
Cite

Text

Voracek and Werner. "Convergence of Some Convex Message Passing Algorithms to a Fixed Point." International Conference on Machine Learning, 2024.

Markdown

[Voracek and Werner. "Convergence of Some Convex Message Passing Algorithms to a Fixed Point." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/voracek2024icml-convergence/)

BibTeX
@inproceedings{voracek2024icml-convergence,
  title     = {{Convergence of Some Convex Message Passing Algorithms to a Fixed Point}},
  author    = {Voracek, Vaclav and Werner, Tomas},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {49688--49697},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/voracek2024icml-convergence/}
}