SDEs for Minimax Optimization
Abstract
Minimax optimization problems have attracted significant attention over the past few years, with applications ranging from economics to machine learning. While advanced optimization methods exist for such problems, characterizing their dynamics in stochastic scenarios remains notably challenging. In this paper, we pioneer the use of stochastic differential equations (SDEs) to analyze and compare minimax optimizers. Our SDE models for Stochastic Gradient Descent-Ascent, Stochastic Extragradient, and Stochastic Hamiltonian Gradient Descent are provable approximations of their algorithmic counterparts, clearly showcasing the interplay between hyperparameters, implicit regularization, and implicit curvature-induced noise. This perspective also allows for a unified and simplified analysis strategy based on the principles of Itô calculus. Finally, our approach facilitates the derivation of convergence conditions and closed-form solutions for the dynamics in simplified settings, unveiling further insights into the behavior of different optimizers.
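To make the SDE viewpoint concrete, below is a minimal, hypothetical Python sketch (not code from the paper): it runs discrete Stochastic Gradient Descent-Ascent (SGDA) on the bilinear game f(x, y) = x * y with additive gradient noise, alongside an Euler-Maruyama discretization of the standard first-order SDE approximation commonly used for SGD-type dynamics. The bilinear objective, the step size eta, and the noise scale sigma are illustrative assumptions, not the paper's exact model.

# Illustrative sketch, assuming the standard first-order SDE approximation
#   dX_t = -grad_x f dt + sqrt(eta) * sigma dW_t,
#   dY_t = +grad_y f dt + sqrt(eta) * sigma dW_t,
# where eta is the learning rate and sigma the gradient-noise scale.
import numpy as np

rng = np.random.default_rng(0)
eta, sigma, steps = 0.01, 0.5, 2000

# Discrete SGDA on f(x, y) = x * y with additive Gaussian gradient noise:
# x descends on f, y ascends on f.
x, y = 1.0, 1.0
for _ in range(steps):
    gx = y + sigma * rng.standard_normal()  # stochastic gradient wrt x
    gy = x + sigma * rng.standard_normal()  # stochastic gradient wrt y
    x, y = x - eta * gx, y + eta * gy

# Euler-Maruyama simulation of the SDE over the same horizon T = eta * steps,
# with the discretization step dt matched to the learning rate eta.
X, Y, dt = 1.0, 1.0, eta
for _ in range(steps):
    dWx = np.sqrt(dt) * rng.standard_normal()  # Brownian increment for X
    dWy = np.sqrt(dt) * rng.standard_normal()  # Brownian increment for Y
    X, Y = (X - dt * Y + np.sqrt(eta) * sigma * dWx,
            Y + dt * X + np.sqrt(eta) * sigma * dWy)

print(f"SGDA iterate: ({x:+.3f}, {y:+.3f})")
print(f"SDE iterate:  ({X:+.3f}, {Y:+.3f})")

Under this assumed model, both trajectories exhibit the rotational drift and noise-driven divergence that are characteristic of SGDA on bilinear games, which is the kind of qualitative correspondence the SDE perspective is meant to capture.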
Cite
Text
Monzio Compagnoni et al. "SDEs for Minimax Optimization." Artificial Intelligence and Statistics, 2024.

Markdown
[Monzio Compagnoni et al. "SDEs for Minimax Optimization." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/monziocompagnoni2024aistats-sdes/)

BibTeX
@inproceedings{monziocompagnoni2024aistats-sdes,
  title     = {{SDEs for Minimax Optimization}},
  author    = {Monzio Compagnoni, Enea and Orvieto, Antonio and Kersting, Hans and Proske, Frank and Lucchi, Aurelien},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2024},
  pages     = {4834--4842},
  volume    = {238},
  url       = {https://mlanthology.org/aistats/2024/monziocompagnoni2024aistats-sdes/}
}