A One-Sample Decentralized Proximal Algorithm for Non-Convex Stochastic Composite Optimization
Abstract
We focus on decentralized stochastic non-convex optimization, where $n$ agents work together to optimize a composite objective function that is the sum of a smooth term and a non-smooth convex term. To solve this problem, we propose two single-time-scale algorithms: \texttt{Prox-DASA} and \texttt{Prox-DASA-GT}. These algorithms can find $\epsilon$-stationary points in $\mathcal{O}(n^{-1}\epsilon^{-2})$ iterations using constant (i.e., $\mathcal{O}(1)$) batch sizes. Unlike prior work, our algorithms achieve comparable complexity without requiring large batch sizes, more complex per-iteration operations (such as double loops), or stronger assumptions. Our theoretical findings are supported by extensive numerical experiments, which demonstrate the superiority of our algorithms over previous approaches. Our code is available at \url{https://github.com/xuxingc/ProxDASA}.
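To make the setting concrete, below is a minimal sketch of the general single-time-scale template the abstract describes (gossip mixing of local iterates and gradient trackers, a single stochastic gradient sample per agent per iteration, and a proximal step on the non-smooth term). This is an illustrative sketch under assumed update rules, not the paper's exact Prox-DASA/Prox-DASA-GT updates; `soft_threshold`, `decentralized_prox_step`, and all parameter names are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1, a common choice of non-smooth convex term.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def decentralized_prox_step(X, Z, W, grad_fn, rng, alpha=0.1, beta=0.1, lam=0.01):
    """One iteration of a generic decentralized stochastic proximal scheme
    (hypothetical sketch, not the paper's exact algorithm):
      1. gossip-average local iterates X and gradient trackers Z via W,
      2. refresh each tracker with ONE stochastic gradient sample (O(1) batch),
      3. take a damped proximal-gradient step on the tracked direction.
    X: (n, d) local iterates; Z: (n, d) trackers; W: (n, n) doubly stochastic mixing matrix.
    """
    n = X.shape[0]
    X_mix = W @ X
    Z_mix = W @ Z
    # One-sample stochastic gradients: constant batch size per agent.
    G = np.stack([grad_fn(i, X_mix[i], rng) for i in range(n)])
    Z_new = (1.0 - beta) * Z_mix + beta * G
    # Proximal step on the averaged direction, then a convex-combination update.
    X_half = np.stack([soft_threshold(X_mix[i] - alpha * Z_new[i], alpha * lam)
                       for i in range(n)])
    X_new = X_mix + beta * (X_half - X_mix)
    return X_new, Z_new
```

On a toy problem where each agent holds a noisy quadratic $f_i(x) = \tfrac{1}{2}\|x - c_i\|^2$ over a ring network, iterating this step drives the agents toward consensus near the minimizer of the averaged composite objective.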
Cite
Text
Xiao et al. "A One-Sample Decentralized Proximal Algorithm for Non-Convex Stochastic Composite Optimization." Uncertainty in Artificial Intelligence, 2023.
Markdown
[Xiao et al. "A One-Sample Decentralized Proximal Algorithm for Non-Convex Stochastic Composite Optimization." Uncertainty in Artificial Intelligence, 2023.](https://mlanthology.org/uai/2023/xiao2023uai-onesample/)
BibTeX
@inproceedings{xiao2023uai-onesample,
title = {{A One-Sample Decentralized Proximal Algorithm for Non-Convex Stochastic Composite Optimization}},
author = {Xiao, Tesi and Chen, Xuxing and Balasubramanian, Krishnakumar and Ghadimi, Saeed},
booktitle = {Uncertainty in Artificial Intelligence},
year = {2023},
pages = {2324--2334},
volume = {216},
url = {https://mlanthology.org/uai/2023/xiao2023uai-onesample/}
}