Enhanced Adaptive Gradient Algorithms for Nonconvex-PL Minimax Optimization
Abstract
Minimax optimization has recently been widely applied to many machine learning tasks, such as generative adversarial networks, robust learning, and reinforcement learning. In this paper, we study a class of nonconvex-nonconcave minimax optimization problems with nonsmooth regularization, where the objective function is possibly nonconvex in the primal variable $x$, and is nonconcave but satisfies the Polyak-Łojasiewicz (PL) condition in the dual variable $y$. We propose a class of enhanced momentum-based gradient descent ascent methods (i.e., MSGDA and AdaMSGDA) to solve these stochastic nonconvex-PL minimax problems. In particular, our AdaMSGDA algorithm can use various adaptive learning rates when updating the variables $x$ and $y$, without relying on any specific type. Theoretically, we prove that our methods attain the best known sample complexity of $\tilde{O}(\epsilon^{-3})$ for finding an $\epsilon$-stationary solution, requiring only one sample per loop. Numerical experiments on a PL game and Wasserstein-GAN demonstrate the efficiency of our proposed methods.
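For reference, the PL condition on the dual variable can be stated as follows (this is the standard formulation; the PL constant $\mu > 0$ is our notation, not quoted from this page): for all $x$ and $y$,

$$\frac{1}{2}\left\|\nabla_y f(x, y)\right\|^2 \;\ge\; \mu \left(\max_{y'} f(x, y') - f(x, y)\right).$$

The condition is weaker than strong concavity in $y$ and, in particular, does not require concavity, which is what allows the nonconvex-nonconcave setting above.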
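To illustrate the general idea of combining momentum gradient estimates with adaptive step sizes, here is a minimal Python sketch on a toy objective that is strongly concave in $y$ (hence PL in $y$). The objective, step sizes, and AdaGrad-style scaling are our own illustrative choices; this is not the paper's MSGDA/AdaMSGDA pseudocode.

```python
import numpy as np

# Toy objective used only for illustration (not from the paper):
#   f(x, y) = 0.5*x**2 + 2*x*y - y**2,
# strongly concave in y, hence PL in y.
def grad_x(x, y):
    return x + 2.0 * y        # df/dx

def grad_y(x, y):
    return 2.0 * x - 2.0 * y  # df/dy

def momentum_gda(x, y, steps=2000, eta=0.05, beta=0.9, eps=1e-8):
    """Generic momentum gradient descent ascent with adaptive scaling.

    A sketch of the technique the abstract describes, not the authors'
    exact algorithm: vx, vy are momentum estimates of the gradients,
    and gx2, gy2 accumulate squared gradients for the adaptive steps.
    """
    vx, vy = grad_x(x, y), grad_y(x, y)      # initial gradient estimates
    gx2, gy2 = 0.0, 0.0                      # accumulated squared gradients
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)  # stochastic grads in practice
        vx = beta * vx + (1.0 - beta) * gx   # momentum on the primal gradient
        vy = beta * vy + (1.0 - beta) * gy   # momentum on the dual gradient
        gx2 += gx * gx                       # adaptive denominators
        gy2 += gy * gy
        x = x - eta * vx / (np.sqrt(gx2) + eps)  # descent step on x
        y = y + eta * vy / (np.sqrt(gy2) + eps)  # ascent step on y
    return x, y

print(momentum_gda(1.0, -1.0))  # should approach the stationary point (0, 0)
```

In this sketch a single loop updates both variables with one gradient evaluation each, mirroring the one-sample-per-loop property claimed in the abstract; the specific variance-reduced momentum estimator that yields the $\tilde{O}(\epsilon^{-3})$ complexity is given in the paper itself.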
Cite
Text
Huang et al. "Enhanced Adaptive Gradient Algorithms for Nonconvex-PL Minimax Optimization." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.Markdown
[Huang et al. "Enhanced Adaptive Gradient Algorithms for Nonconvex-PL Minimax Optimization." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.](https://mlanthology.org/aistats/2025/huang2025aistats-enhanced/)
BibTeX
@inproceedings{huang2025aistats-enhanced,
title = {{Enhanced Adaptive Gradient Algorithms for Nonconvex-PL Minimax Optimization}},
author = {Huang, Feihu and Xuan, Chunyu and Wang, Xinrui and Zhang, Siqi and Chen, Songcan},
booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
year = {2025},
pages = {3439--3447},
volume = {258},
url = {https://mlanthology.org/aistats/2025/huang2025aistats-enhanced/}
}