An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization

Abstract

This paper studies stochastic nonconvex-strongly-concave minimax optimization over a multi-agent network. We propose an efficient algorithm, called the Decentralized Recursive gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding $\epsilon$-stationary points. Concretely, it requires $\mathcal{O}(\min(\kappa^3 \epsilon^{-3}, \kappa^2 \sqrt{N} \epsilon^{-2}))$ stochastic first-order oracle (SFO) calls and $\tilde{\mathcal{O}}(\kappa^2 \epsilon^{-2})$ communication rounds, where $\kappa$ is the condition number and $N$ is the total number of individual functions. Our numerical experiments also validate the superiority of DREAM over previous methods.
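For context, work in this area typically considers the following finite-sum minimax template; the split of the $N$ components across $m$ agents holding $n$ local functions each (so $N = mn$) is an illustrative assumption on our part, not notation taken from the abstract:

$$\min_{x \in \mathbb{R}^{d_x}} \; \max_{y \in \mathbb{R}^{d_y}} \; f(x, y) = \frac{1}{m} \sum_{i=1}^{m} f_i(x, y), \qquad f_i(x, y) = \frac{1}{n} \sum_{j=1}^{n} f_{i,j}(x, y),$$

where $f(x, \cdot)$ is $\mu$-strongly concave for every $x$, each component is $\ell$-smooth, and $\kappa = \ell/\mu$ is the condition number. An $\epsilon$-stationary point is then an $x$ with $\|\nabla \Phi(x)\| \le \epsilon$ for the primal function $\Phi(x) = \max_y f(x, y)$.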

Cite

Text

Chen et al. "An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization." Artificial Intelligence and Statistics, 2024.

Markdown

[Chen et al. "An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/chen2024aistats-efficient/)

BibTeX

@inproceedings{chen2024aistats-efficient,
  title     = {{An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization}},
  author    = {Chen, Lesi and Ye, Haishan and Luo, Luo},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2024},
  pages     = {1990--1998},
  volume    = {238},
  url       = {https://mlanthology.org/aistats/2024/chen2024aistats-efficient/}
}