Jointly Improving the Sample and Communication Complexities in Decentralized Stochastic Minimax Optimization
Abstract
We propose a novel single-loop decentralized algorithm, DGDA-VR, for solving stochastic nonconvex-strongly-concave minimax problems over a connected network of agents, each equipped with a stochastic first-order oracle to estimate its local gradient. By incorporating variance reduction, DGDA-VR achieves O(ε^−3) oracle complexity and O(ε^−2) communication complexity without resorting to multiple communication rounds per iteration; both complexities are optimal, i.e., they match the lower bounds for this class of problems. Because DGDA-VR does not require multiple communication rounds, it is applicable to a broader range of decentralized computational environments. To the best of our knowledge, this is the first distributed method that uses a single communication round in each iteration to jointly optimize the oracle and communication complexities for the problem considered here.
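To make the abstract's description concrete, below is a minimal sketch of one iteration of decentralized gradient descent ascent with a variance-reduced gradient estimator and a single gossip (communication) round. This is an illustration only, not the paper's exact DGDA-VR update: the oracle interface `oracle(i, x, y, sample)`, the STORM-style momentum recursion with parameter `beta`, and the step sizes `eta_x`, `eta_y` are assumptions made for the sketch.

```python
import numpy as np

def decentralized_vr_gda_step(X, Y, Dx, Dy, W, oracle, eta_x, eta_y, beta, rng):
    """One iteration across n agents (rows of X and Y).

    X, Y   : (n, dx) / (n, dy) arrays of local min / max iterates.
    Dx, Dy : (n, dx) / (n, dy) variance-reduced gradient estimators.
    W      : (n, n) doubly stochastic mixing matrix of the network graph.
    oracle : oracle(i, x, y, sample) -> (gx, gy), agent i's stochastic
             gradients of its local loss at (x, y) on the given sample.
    """
    # Single communication round per iteration: neighbors exchange and
    # average both the iterates and the gradient estimators via W.
    X_mix, Y_mix = W @ X, W @ Y
    Dx_mix, Dy_mix = W @ Dx, W @ Dy

    # Local step: gradient descent on the min variable, ascent on the max.
    X_new = X_mix - eta_x * Dx_mix
    Y_new = Y_mix + eta_y * Dy_mix

    # Variance-reduced recursion (STORM-style momentum, an assumption here):
    # each agent evaluates the *same* fresh sample at the new and old
    # iterates, so the correction term cancels most of the stochastic noise.
    Dx_new, Dy_new = np.empty_like(Dx), np.empty_like(Dy)
    for i in range(X.shape[0]):
        sample = rng.integers(1_000_000_000)  # sample index / random seed
        gx1, gy1 = oracle(i, X_new[i], Y_new[i], sample)
        gx0, gy0 = oracle(i, X[i], Y[i], sample)
        Dx_new[i] = gx1 + (1.0 - beta) * (Dx[i] - gx0)
        Dy_new[i] = gy1 + (1.0 - beta) * (Dy[i] - gy0)
    return X_new, Y_new, Dx_new, Dy_new
```

Note that each iteration multiplies by W exactly once per exchanged quantity, which is what "single communication round" means above; the O(ε^−3) and O(ε^−2) complexities claimed in the abstract refer to the paper's analysis of its own update, not to this sketch.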
Cite
Text
Zhang et al. "Jointly Improving the Sample and Communication Complexities in Decentralized Stochastic Minimax Optimization." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I18.30076

Markdown
[Zhang et al. "Jointly Improving the Sample and Communication Complexities in Decentralized Stochastic Minimax Optimization." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/zhang2024aaai-jointly/) doi:10.1609/AAAI.V38I18.30076

BibTeX
@inproceedings{zhang2024aaai-jointly,
title = {{Jointly Improving the Sample and Communication Complexities in Decentralized Stochastic Minimax Optimization}},
author = {Zhang, Xuan and Mancino-Ball, Gabriel and Aybat, Necdet Serhat and Xu, Yangyang},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2024},
pages = {20865--20873},
doi = {10.1609/AAAI.V38I18.30076},
url = {https://mlanthology.org/aaai/2024/zhang2024aaai-jointly/}
}