Decentralized Sum-of-Nonconvex Optimization

Abstract

We consider the problem of minimizing a sum-of-nonconvex function, i.e., a convex function that is the average of nonconvex components. Existing stochastic algorithms for this problem focus only on a single machine and the centralized scenario. In this paper, we study sum-of-nonconvex optimization in the decentralized setting. We present a new theoretical analysis of the PMGT-SVRG algorithm for this problem and prove its linear convergence. However, the convergence rate of PMGT-SVRG depends linearly on the condition number, which is undesirable for ill-conditioned problems. To remedy this issue, we propose an accelerated stochastic decentralized first-order algorithm that incorporates the techniques of acceleration, gradient tracking, and multi-consensus mixing into the SVRG algorithm. The convergence rate of the proposed method has a square-root dependency on the condition number. Numerical experiments on both synthetic and real-world datasets validate the theoretical guarantees of our proposed algorithms.
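To make the "sum-of-nonconvex" notion concrete, here is a minimal sketch (our own toy construction, not an example from the paper): two nonconvex components whose cubic terms cancel, so their average is the convex function x².

```python
# Toy sum-of-nonconvex objective: each component f_i is nonconvex,
# yet their average is convex (the cubic terms cancel).
def f1(x):
    return x**2 + x**3  # f1''(x) = 2 + 6x changes sign -> nonconvex

def f2(x):
    return x**2 - x**3  # f2''(x) = 2 - 6x changes sign -> nonconvex

def avg(x):
    return 0.5 * (f1(x) + f2(x))  # equals x^2, which is convex

# Central finite-difference estimate of the second derivative.
def second_derivative(f, x, h=1e-4):
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

print(second_derivative(f1, -1.0))  # negative: f1 is nonconvex near x = -1
print(second_derivative(f2, 1.0))   # negative: f2 is nonconvex near x = 1
print(second_derivative(avg, -1.0)) # ~2 > 0: the average is convex
```

In the decentralized setting studied by the paper, each node would hold one such component and the nodes jointly minimize the (convex) average.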

Cite

Text

Liu and Low. "Decentralized Sum-of-Nonconvex Optimization." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I13.29318

Markdown

[Liu and Low. "Decentralized Sum-of-Nonconvex Optimization." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/liu2024aaai-decentralized-a/) doi:10.1609/AAAI.V38I13.29318

BibTeX

@inproceedings{liu2024aaai-decentralized-a,
  title     = {{Decentralized Sum-of-Nonconvex Optimization}},
  author    = {Liu, Zhuanghua and Low, Bryan Kian Hsiang},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {14088--14096},
  doi       = {10.1609/AAAI.V38I13.29318},
  url       = {https://mlanthology.org/aaai/2024/liu2024aaai-decentralized-a/}
}