Prox-PDA: The Proximal Primal-Dual Algorithm for Fast Distributed Nonconvex Optimization and Learning over Networks
Abstract
In this paper we consider nonconvex optimization and learning over a network of distributed nodes. We develop a Proximal Primal-Dual Algorithm (Prox-PDA), which enables the network nodes to distributedly and collectively compute the set of first-order stationary solutions in a global sublinear manner [with a rate of $O(1/r)$, where $r$ is the iteration counter]. To the best of our knowledge, this is the first algorithm that enables distributed nonconvex optimization with global rate guarantees. Our numerical experiments also demonstrate the effectiveness of the proposed algorithm.
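The abstract describes a proximal primal-dual scheme for driving networked nodes to a first-order stationary point of a nonconvex consensus problem. As a purely illustrative aid, the sketch below runs a generic proximal augmented-Lagrangian primal-dual iteration on a toy three-node problem; the graph, local objectives, step sizes, and the inexact inner solver are all assumptions made for demonstration and are not the paper's exact Prox-PDA recursion.

```python
import numpy as np

# Hedged sketch of a proximal primal-dual iteration for distributed
# nonconvex consensus:
#     minimize  sum_i f_i(x_i)   subject to  A x = 0,
# where A is the edge-node incidence matrix of the network.
# All constants below are illustrative choices, not the paper's.

# 3-node path graph with edges (0,1) and (1,2).
A = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])

# Nonconvex local costs (invented for this demo): f_i(x) = a_i x^2 + sin(b_i x).
a = np.array([1.0, 0.5, 2.0])
b = np.array([1.0, 2.0, 0.5])

def grad_f(x):
    return 2.0 * a * x + b * np.cos(b * x)

beta = 5.0           # penalty / dual step size (assumed constant,
                     # large enough to convexify the primal subproblem)
x = np.array([1.0, -1.0, 0.5])
mu = np.zeros(2)     # one dual variable per edge

for r in range(1000):
    x_prev = x.copy()
    # Primal step: approximately minimize the augmented Lagrangian plus a
    # proximal term (beta/2)||x - x_prev||^2 via inner gradient steps.
    for _ in range(100):
        g = grad_f(x) + A.T @ mu + beta * A.T @ (A @ x) + beta * (x - x_prev)
        x = x - 0.02 * g
    # Dual ascent on the consensus constraint A x = 0.
    mu = mu + beta * (A @ x)

print("consensus gap:", np.abs(A @ x).max())
```

At a fixed point of these updates the consensus constraint holds and the stacked gradients satisfy the first-order stationarity condition, which is the kind of guarantee the abstract refers to; the actual algorithm and its O(1/r) rate analysis are in the paper itself.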
Cite
Text
Hong et al. "Prox-PDA: The Proximal Primal-Dual Algorithm for Fast Distributed Nonconvex Optimization and Learning over Networks." International Conference on Machine Learning, 2017.
Markdown
[Hong et al. "Prox-PDA: The Proximal Primal-Dual Algorithm for Fast Distributed Nonconvex Optimization and Learning over Networks." International Conference on Machine Learning, 2017.](https://mlanthology.org/icml/2017/hong2017icml-proxpda/)
BibTeX
@inproceedings{hong2017icml-proxpda,
title = {{Prox-PDA: The Proximal Primal-Dual Algorithm for Fast Distributed Nonconvex Optimization and Learning over Networks}},
author = {Hong, Mingyi and Hajinezhad, Davood and Zhao, Ming-Min},
booktitle = {International Conference on Machine Learning},
year = {2017},
pages = {1529--1538},
volume = {70},
url = {https://mlanthology.org/icml/2017/hong2017icml-proxpda/}
}