Convergence of Min-Sum-Min Message-Passing for Quadratic Optimization
Abstract
We propose a new message-passing algorithm for the quadratic optimization problem. As opposed to the min-sum algorithm, the new algorithm involves two minimizations and one summation at each iteration. The new min-sum-min algorithm exploits feedback from the last iteration when generating new messages, resembling the Jacobi-relaxation algorithm. We show that if the feedback signal is large enough, the min-sum-min algorithm is guaranteed to converge to the optimal solution. Experimental results show that the min-sum-min algorithm outperforms two reference methods in convergence speed.
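For background, the quadratic optimization problem in question amounts to minimizing f(x) = ½ xᵀJx − hᵀx, whose optimum solves Jx = h. The Jacobi-relaxation algorithm that the abstract compares against is the classical weighted Jacobi iteration, sketched below. This is a minimal illustration of that standard baseline, not the paper's min-sum-min method; the matrix, vector, and step size ω are illustrative choices.

```python
import numpy as np

def jacobi_relaxation(J, h, omega=0.5, iters=200):
    """Weighted (relaxed) Jacobi iteration for solving J x = h,
    i.e. minimizing the quadratic f(x) = 0.5 x^T J x - h^T x.
    Each update mixes the new coordinate-wise estimate with the
    previous iterate, with omega controlling the feedback strength."""
    D = np.diag(J)                       # diagonal of J
    x = np.zeros_like(h, dtype=float)
    for _ in range(iters):
        x = x + omega * (h - J @ x) / D  # relaxed update toward D^{-1}(h - (J-D)x)
    return x

# Example: a diagonally dominant SPD system, for which weighted Jacobi converges.
J = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
h = np.array([1.0, 2.0, 3.0])
x = jacobi_relaxation(J, h)
```

For diagonally dominant J the iteration matrix I − ωD⁻¹J is a contraction, so the iterate converges to the minimizer of f; message-passing schemes such as min-sum aim to achieve the same fixed point via local message updates on the graph of J.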
Cite
Text
Zhang and Heusdens. "Convergence of Min-Sum-Min Message-Passing for Quadratic Optimization." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2014. doi:10.1007/978-3-662-44845-8_23
Markdown
[Zhang and Heusdens. "Convergence of Min-Sum-Min Message-Passing for Quadratic Optimization." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2014.](https://mlanthology.org/ecmlpkdd/2014/zhang2014ecmlpkdd-convergence/) doi:10.1007/978-3-662-44845-8_23
BibTeX
@inproceedings{zhang2014ecmlpkdd-convergence,
title = {{Convergence of Min-Sum-Min Message-Passing for Quadratic Optimization}},
author = {Zhang, Guoqiang and Heusdens, Richard},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2014},
pages = {353--368},
doi = {10.1007/978-3-662-44845-8_23},
url = {https://mlanthology.org/ecmlpkdd/2014/zhang2014ecmlpkdd-convergence/}
}