On the Convergence of Stochastic Compositional Gradient Descent Ascent Method
Abstract
The compositional minimax problem covers a wide range of machine learning models, such as the distributionally robust compositional optimization problem. However, how to optimize the compositional minimax problem remains understudied. In this paper, we develop a novel, efficient stochastic compositional gradient descent ascent method for optimizing the compositional minimax problem, and we establish the theoretical convergence rate of the proposed method. To the best of our knowledge, this is the first work to achieve such a convergence rate for the compositional minimax problem. Finally, we conduct extensive experiments to demonstrate the effectiveness of our proposed method.
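To make the problem class concrete, the sketch below runs a generic stochastic compositional gradient descent ascent loop on a toy instance min_x max_y E_a[y·(a·x) − 0.5·y²] with a ~ N(2, 0.1²), whose saddle point is x = y = 0. It uses a standard moving-average estimate of the inner function value together with simultaneous descent/ascent steps; this is an illustrative sketch of the problem setup, not the paper's algorithm, and the step sizes and moving-average weight beta are assumptions chosen for the toy problem.

```python
import numpy as np

# Toy compositional minimax: min_x max_y E_a[ y * (a * x) - 0.5 * y^2 ],
# where the inner map is g(x) = E[a] * x with a ~ N(2, 0.1^2).
# The unique saddle point is (x, y) = (0, 0).
# NOTE: this is a generic stochastic compositional GDA sketch, not the
# method analyzed in the paper; beta, eta_x, eta_y are illustrative.

rng = np.random.default_rng(0)

x, y, u = 1.0, 0.0, 0.0            # u is a running estimate of the inner value g(x)
beta, eta_x, eta_y = 0.2, 0.05, 0.1

for _ in range(1000):
    a = 2.0 + 0.1 * rng.standard_normal()  # stochastic sample of the inner map
    u = (1 - beta) * u + beta * (a * x)    # moving-average tracking of g(x)
    grad_x = a * y                         # chain rule: g'(x) * df/du, with df/du = y
    grad_y = u - y                         # df/dy evaluated at the estimated inner value
    x -= eta_x * grad_x                    # gradient descent on the min variable
    y += eta_y * grad_y                    # gradient ascent on the max variable

print(abs(x), abs(y))  # both iterates should end up near the saddle point 0
```

The moving average `u` is the standard device for handling the inner expectation: a fresh sample `a * x` alone would make the outer gradient biased, while the tracking variable averages the noise away as the iterates stabilize.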
Cite
Text
Gao et al. "On the Convergence of Stochastic Compositional Gradient Descent Ascent Method." International Joint Conference on Artificial Intelligence, 2021. doi:10.24963/IJCAI.2021/329
Markdown
[Gao et al. "On the Convergence of Stochastic Compositional Gradient Descent Ascent Method." International Joint Conference on Artificial Intelligence, 2021.](https://mlanthology.org/ijcai/2021/gao2021ijcai-convergence/) doi:10.24963/IJCAI.2021/329
BibTeX
@inproceedings{gao2021ijcai-convergence,
title = {{On the Convergence of Stochastic Compositional Gradient Descent Ascent Method}},
author = {Gao, Hongchang and Wang, Xiaoqian and Luo, Lei and Shi, Xinghua},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2021},
pages = {2389--2395},
doi = {10.24963/IJCAI.2021/329},
url = {https://mlanthology.org/ijcai/2021/gao2021ijcai-convergence/}
}