On the Hypomonotone Class of Variational Inequalities
Abstract
This paper studies the behavior of the extragradient algorithm when applied to hypomonotone operators, a class of problems that extends beyond the classical monotone setting. While the extragradient method is widely known for its efficacy in solving variational inequalities with monotone and Lipschitz continuous operators, we demonstrate that its convergence is not guaranteed in the hypomonotone setting. We construct an example such that, by choosing the starting point appropriately, the extragradient method diverges for every step size. Our results highlight the need for stronger assumptions to guarantee convergence of the extragradient method, and motivate further development of existing VI methods for broader classes of problems.
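For context, a brief sketch of the standard definitions involved, which are not spelled out on this page: an operator $F$ is $\rho$-hypomonotone if
$$\langle F(x) - F(y),\, x - y \rangle \;\ge\; -\rho \, \|x - y\|^2 \quad \text{for all } x, y,$$
and the extragradient iteration with step size $\gamma > 0$ reads
$$x_{k+1/2} = x_k - \gamma F(x_k), \qquad x_{k+1} = x_k - \gamma F(x_{k+1/2}).$$
The paper's result concerns the behavior of this iteration when $F$ is only hypomonotone rather than monotone.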
Cite
Text
Alomar and Chavdarova. "On the Hypomonotone Class of Variational Inequalities." NeurIPS 2024 Workshops: OPT, 2024.
Markdown
[Alomar and Chavdarova. "On the Hypomonotone Class of Variational Inequalities." NeurIPS 2024 Workshops: OPT, 2024.](https://mlanthology.org/neuripsw/2024/alomar2024neuripsw-hypomonotone/)
BibTeX
@inproceedings{alomar2024neuripsw-hypomonotone,
  title = {{On the Hypomonotone Class of Variational Inequalities}},
  author = {Alomar, Khaled and Chavdarova, Tatjana},
  booktitle = {NeurIPS 2024 Workshops: OPT},
  year = {2024},
  url = {https://mlanthology.org/neuripsw/2024/alomar2024neuripsw-hypomonotone/}
}