Generalized Smooth Stochastic Variational Inequalities: Almost Sure Convergence and Convergence Rates
Abstract
This paper focuses on solving a stochastic variational inequality (SVI) problem under a relaxed smoothness assumption for a class of structured non-monotone operators. The SVI problem has attracted significant interest in the machine learning community due to its immediate applications to adversarial training and multi-agent reinforcement learning. In many such applications, the resulting operators do not satisfy the standard smoothness assumption. To address this issue, we focus on a weaker generalized smoothness assumption called $\alpha$-symmetric. Under the $p$-quasi-sharpness and $\alpha$-symmetric assumptions on the operator, we study clipped projection (gradient descent-ascent) and clipped Korpelevich (extragradient) methods. For these clipped methods, we provide the first almost-sure convergence results without making any assumptions on the boundedness of either the stochastic operator or the stochastic samples. We also provide the first in-expectation unbiased convergence rate results for these methods under a relaxed smoothness assumption for $\alpha \leq \frac{1}{2}$.
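The abstract refers to clipped projection and clipped Korpelevich (extragradient) updates for the SVI problem of finding $x^*$ such that $\langle F(x^*), x - x^* \rangle \geq 0$ for all feasible $x$. The Python sketch below illustrates one possible clipped extragradient iteration; the norm-clipping rule and the constants `gamma` (step size) and `lam` (clipping level) are illustrative assumptions and do not reproduce the paper's exact parameter choices.

```python
import numpy as np

def clip(g, lam):
    # Norm clipping: rescale g when its norm exceeds the level lam
    # (an illustrative rule, not necessarily the paper's exact scheme).
    norm = np.linalg.norm(g)
    return g if norm <= lam else (lam / norm) * g

def clipped_extragradient(oracle, project, x0, gamma, lam, n_iters, rng):
    """Sketch of a clipped Korpelevich (extragradient) method for an SVI.

    oracle(x, rng) -- a stochastic sample of the operator F at x
    project(x)     -- Euclidean projection onto the feasible set
    gamma, lam     -- step size and clipping level (hypothetical constants)
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        # Extrapolation step with a clipped stochastic operator sample.
        x_half = project(x - gamma * clip(oracle(x, rng), lam))
        # Update step from x, using a fresh clipped sample at the extrapolated point.
        x = project(x - gamma * clip(oracle(x_half, rng), lam))
    return x

if __name__ == "__main__":
    # Toy example: unconstrained bilinear saddle point min_u max_v u*v,
    # whose operator F(u, v) = (v, -u) is monotone; noise is added to the samples.
    rng = np.random.default_rng(0)
    oracle = lambda z, rng: np.array([z[1], -z[0]]) + 0.1 * rng.standard_normal(2)
    identity = lambda z: z  # no constraint set, so projection is the identity
    z_final = clipped_extragradient(oracle, identity, [1.0, 1.0],
                                    gamma=0.1, lam=5.0, n_iters=2000, rng=rng)
    print("approximate solution:", z_final)
```

In this toy run the iterates approach the unique solution $(0, 0)$ up to a noise-dominated neighborhood; the clipped projection (gradient descent-ascent) variant named in the abstract would use only the second, non-extrapolated update per iteration.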
Cite
Text
Vankov et al. "Generalized Smooth Stochastic Variational Inequalities: Almost Sure Convergence and Convergence Rates." Transactions on Machine Learning Research, 2025.
Markdown
[Vankov et al. "Generalized Smooth Stochastic Variational Inequalities: Almost Sure Convergence and Convergence Rates." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/vankov2025tmlr-generalized/)
BibTeX
@article{vankov2025tmlr-generalized,
title = {{Generalized Smooth Stochastic Variational Inequalities: Almost Sure Convergence and Convergence Rates}},
author = {Vankov, Daniil and Nedich, Angelia and Sankar, Lalitha},
journal = {Transactions on Machine Learning Research},
year = {2025},
url = {https://mlanthology.org/tmlr/2025/vankov2025tmlr-generalized/}
}