Optimal Extragradient-Based Algorithms for Stochastic Variational Inequalities with Separable Structure

Abstract

We consider the problem of solving stochastic monotone variational inequalities (VIs) with a separable structure using a stochastic first-order oracle. Building on the standard extragradient method, we propose a novel algorithm, stochastic accelerated gradient-extragradient (AG-EG), for strongly monotone VIs. Our approach combines the strengths of extragradient and Nesterov acceleration. By showing that its iterates remain in a bounded domain and applying scheduled restarting, we prove that AG-EG achieves an optimal convergence rate for strongly monotone VIs. Furthermore, when specialized to the particular case of bilinearly coupled strongly-convex-strongly-concave saddle-point problems, including bilinear games, our algorithm attains fine-grained convergence rates that match the respective lower bounds, with stochasticity characterized by an additive statistical error term that is optimal up to a constant prefactor.
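For context, a monotone VI asks for a point z* in a constraint set Z such that <F(z*), z - z*> >= 0 for all z in Z, where F is the problem's operator. The sketch below illustrates the classical extragradient step that AG-EG builds on: a look-ahead step with the operator, followed by an update using the look-ahead evaluation. This is a minimal deterministic sketch, not the authors' AG-EG method; the Nesterov-style acceleration and scheduled restarting are developed in the paper, and the names F, eta, and project here are illustrative placeholders.

import numpy as np

def extragradient_step(z, F, eta, project=lambda w: w):
    # z       : current iterate (np.ndarray)
    # F       : VI operator, e.g. F(x, y) = (grad_x f, -grad_y f) for a saddle point
    # eta     : step size, assumed small relative to the Lipschitz constant of F
    # project : projection onto the constraint set (identity if unconstrained)
    z_half = project(z - eta * F(z))      # extrapolation (look-ahead) step
    return project(z - eta * F(z_half))   # update using the look-ahead operator value

# Example: the bilinear game min_x max_y x^T B y, whose operator is F(z) = (B y, -B^T x).
B = np.array([[2.0, 0.0], [0.0, 1.0]])

def F(z):
    x, y = z[:2], z[2:]
    return np.concatenate([B @ y, -B.T @ x])

z = np.array([1.0, -1.0, 0.5, 0.5])
for _ in range(200):
    z = extragradient_step(z, F, eta=0.1)
# The iterates approach the equilibrium at the origin; plain simultaneous
# gradient steps would diverge on this skew-symmetric (bilinear) problem.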

Cite

Text

Yuan et al. "Optimal Extragradient-Based Algorithms for Stochastic Variational Inequalities with Separable Structure." Neural Information Processing Systems, 2023.

Markdown

[Yuan et al. "Optimal Extragradient-Based Algorithms for Stochastic Variational Inequalities with Separable Structure." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/yuan2023neurips-optimal/)

BibTeX

@inproceedings{yuan2023neurips-optimal,
  title     = {{Optimal Extragradient-Based Algorithms for Stochastic Variational Inequalities with Separable Structure}},
  author    = {Yuan, Angela and Li, Chris Junchi and Gidel, Gauthier and Jordan, Michael I. and Gu, Quanquan and Du, Simon S.},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/yuan2023neurips-optimal/}
}