Accelerated Consensus via Min-Sum Splitting

Abstract

We apply the Min-Sum message-passing protocol to solve the consensus problem in distributed optimization. We show that while the ordinary Min-Sum algorithm does not converge, a modified version of it known as Splitting yields convergence to the problem solution. We prove that a proper choice of the tuning parameters allows Min-Sum Splitting to yield subdiffusive accelerated convergence rates, matching the rates obtained by shift-register methods. The acceleration scheme embodied by Min-Sum Splitting for the consensus problem bears similarities to lifted Markov chain techniques and to multi-step first-order methods in convex optimization.
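The shift-register acceleration whose rates the abstract says Min-Sum Splitting matches can be illustrated with a small numerical sketch. The code below is not the authors' message-passing protocol; it compares plain diffusive averaging, x ← W x, against a two-register iteration of the form x_{t+1} = (1+γ) W x_t − γ x_{t−1} on a cycle graph. The lazy-walk weight matrix, the choice of γ from the spectral gap of W, and all function names are illustrative assumptions.

```python
# Illustrative sketch (NOT the paper's Min-Sum Splitting protocol):
# diffusive consensus vs. a shift-register-style two-register iteration.
import numpy as np

def cycle_weights(n):
    """Doubly stochastic lazy-walk matrix on an n-node cycle (assumed graph)."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 0.5
        W[i, (i - 1) % n] = 0.25
        W[i, (i + 1) % n] = 0.25
    return W

def diffusive(W, x0, T):
    """Plain consensus: repeatedly average with neighbors, x <- W x."""
    x = x0.copy()
    for _ in range(T):
        x = W @ x
    return x

def shift_register(W, x0, T, gamma):
    """Two-register iteration x_{t+1} = (1+gamma) W x_t - gamma x_{t-1}."""
    x_prev, x = x0.copy(), W @ x0
    for _ in range(T - 1):
        x, x_prev = (1 + gamma) * (W @ x) - gamma * x_prev, x
    return x

n, T = 50, 200
rng = np.random.default_rng(0)
x0 = rng.standard_normal(n)
avg = x0.mean() * np.ones(n)
W = cycle_weights(n)

# gamma tuned from the second-largest eigenvalue of W (an illustrative
# momentum-style choice, analogous to heavy-ball tuning).
lam2 = np.sort(np.linalg.eigvalsh(W))[-2]
gamma = ((1 - np.sqrt(1 - lam2**2)) / lam2) ** 2

err_diff = np.linalg.norm(diffusive(W, x0, T) - avg)
err_acc = np.linalg.norm(shift_register(W, x0, T, gamma) - avg)
print(err_diff, err_acc)
```

On this poorly connected topology the accelerated iterate reaches the average far faster than the diffusive one, which is the qualitative gap (diffusive vs. subdiffusive rates) that the paper's tuned Min-Sum Splitting closes.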

Cite

Text

Rebeschini and Tatikonda. "Accelerated Consensus via Min-Sum Splitting." Neural Information Processing Systems, 2017.

Markdown

[Rebeschini and Tatikonda. "Accelerated Consensus via Min-Sum Splitting." Neural Information Processing Systems, 2017.](https://mlanthology.org/neurips/2017/rebeschini2017neurips-accelerated/)

BibTeX

@inproceedings{rebeschini2017neurips-accelerated,
  title     = {{Accelerated Consensus via Min-Sum Splitting}},
  author    = {Rebeschini, Patrick and Tatikonda, Sekhar C},
  booktitle = {Neural Information Processing Systems},
  year      = {2017},
  pages     = {1374--1384},
  url       = {https://mlanthology.org/neurips/2017/rebeschini2017neurips-accelerated/}
}