A Fast Optimistic Method for Monotone Variational Inequalities
Abstract
We study monotone variational inequalities that can arise as optimality conditions for constrained convex optimization or convex-concave minimax problems and propose a novel algorithm that uses only one gradient/operator evaluation and one projection onto the constraint set per iteration. The algorithm, which we call fOGDA-VI, achieves an $o(\frac{1}{k})$ rate of convergence in terms of the restricted gap function as well as the natural residual for the last iterate. Moreover, we provide a convergence guarantee for the sequence of iterates to a solution of the variational inequality. These are the best theoretical convergence results reported in the literature for numerical methods addressing variational inequalities that are merely monotone (not strongly monotone). To empirically validate our algorithm, we investigate a two-player matrix game with mixed strategies. Finally, we show promising results regarding the application of fOGDA-VI to the training of generative adversarial nets.
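The abstract leaves the two convergence measures undefined; for reference, the following are their standard formulations in the variational-inequality literature, for an operator $F$ and constraint set $C$ (the paper may use slightly different normalizations, e.g. a step size inside the residual):

```latex
% Standard convergence measures for a VI with operator F and constraint set C:
% res(z) is the fixed-point error of one projected step; Gap_R(z) restricts
% the supremum to a closed ball of radius R around a reference point z_0.
\[
  \operatorname{res}(z) = \bigl\| z - P_C\bigl(z - F(z)\bigr) \bigr\|,
  \qquad
  \operatorname{Gap}_R(z) = \sup_{u \in C \cap \bar{B}(z_0, R)} \langle F(u),\, z - u \rangle .
\]
```

Since the abstract specifies only the per-iteration cost (one operator evaluation and one projection) and the matrix-game experiment, the sketch below shows that template in its plainest form: projected optimistic gradient (Popov/OGDA) iterations for the bilinear game $\min_x \max_y x^\top A y$ over probability simplices. This is not the authors' fOGDA-VI update, which attains the stated $o(1/k)$ rates through additional extrapolation terms given in the paper; the step-size rule, iteration count, and simplex-projection helper are illustrative assumptions.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection onto the probability simplex (Duchi et al., 2008)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1.0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def ogda_matrix_game(A, steps=5000, tau=None):
    """Projected optimistic gradient method for the monotone operator
    F(x, y) = (A y, -A^T x) of the matrix game min_x max_y x^T A y,
    with x and y constrained to probability simplices.
    Each iteration uses one operator evaluation and one projection per block."""
    m, n = A.shape
    if tau is None:
        # F is ||A||_2-Lipschitz; tau < 1/(2 ||A||_2) is the usual safe regime.
        tau = 1.0 / (3.0 * np.linalg.norm(A, 2))
    x, y = np.full(m, 1.0 / m), np.full(n, 1.0 / n)
    Fx_prev, Fy_prev = A @ y, -A.T @ x
    for _ in range(steps):
        Fx, Fy = A @ y, -A.T @ x  # single operator evaluation per iteration
        # Optimistic correction 2 F(z_k) - F(z_{k-1}) reuses the stored value.
        x = project_simplex(x - tau * (2.0 * Fx - Fx_prev))
        y = project_simplex(y - tau * (2.0 * Fy - Fy_prev))
        Fx_prev, Fy_prev = Fx, Fy
    # Natural residual ||z - P_C(z - tau F(z))|| as the convergence measure.
    rx = x - project_simplex(x - tau * (A @ y))
    ry = y - project_simplex(y - tau * (-A.T @ x))
    return x, y, float(np.sqrt(np.sum(rx**2) + np.sum(ry**2)))

if __name__ == "__main__":
    A = np.random.default_rng(0).standard_normal((5, 5))
    x, y, res = ogda_matrix_game(A)
    print("mixed strategies:", x, y, "natural residual:", res)
```

The simultaneous (Jacobi-style) update treats $z = (x, y)$ as a single variable of the VI, matching the monotone-operator view rather than alternating best responses.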
Cite
Text
Sedlmayer et al. "A Fast Optimistic Method for Monotone Variational Inequalities." International Conference on Machine Learning, 2023.
Markdown
[Sedlmayer et al. "A Fast Optimistic Method for Monotone Variational Inequalities." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/sedlmayer2023icml-fast/)
BibTeX
@inproceedings{sedlmayer2023icml-fast,
title = {{A Fast Optimistic Method for Monotone Variational Inequalities}},
author = {Sedlmayer, Michael and Nguyen, Dang-Khoa and Bot, Radu Ioan},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {30406--30438},
volume = {202},
url = {https://mlanthology.org/icml/2023/sedlmayer2023icml-fast/}
}