A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise

Abstract

We consider variational inequalities arising from monotone operators, a setting that includes convex minimization and convex-concave saddle-point problems. We assume access to potentially noisy unbiased values of the monotone operator and assess convergence through a compatible gap function, which corresponds to the standard optimality criterion in each of the aforementioned special cases. We present a universal algorithm for these inequalities based on the Mirror-Prox algorithm. Concretely, our algorithm *simultaneously* achieves the optimal rates for the smooth/non-smooth and noisy/noiseless settings. This is done without any prior knowledge of these properties, and in the general setup of arbitrary norms and compatible Bregman divergences. For convex minimization and convex-concave saddle-point problems, this leads to new adaptive algorithms. Our method relies on a novel yet simple adaptive choice of the step-size, which can be seen as the appropriate extension of AdaGrad to handle constrained problems.
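To make the abstract's key idea concrete, here is a minimal Python sketch of an extragradient (Euclidean Mirror-Prox) loop with an AdaGrad-style step-size on a ball-constrained problem. The function names, the `G0` offset, the accumulation of squared operator norms, and the weighted averaging are illustrative assumptions for the Euclidean special case, not the paper's exact algorithm, which handles general norms and Bregman divergences.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball of a given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def adaptive_mirror_prox(F, x0, T, D=1.0, G0=1.0, radius=1.0):
    """
    Illustrative extragradient (Mirror-Prox) loop with an AdaGrad-style
    step-size. `F` returns a (possibly noisy, unbiased) value of the
    monotone operator at a point. The step-size shrinks with the
    accumulated squared operator values, so neither the smoothness
    constant nor the noise level needs to be known in advance.
    """
    x = x0.copy()
    acc = G0 ** 2                        # running sum of squared operator norms
    avg, weight = np.zeros_like(x0), 0.0
    for _ in range(T):
        eta = D / np.sqrt(acc)           # adaptive step-size
        y = project_ball(x - eta * F(x), radius)   # extrapolation step
        gy = F(y)
        x = project_ball(x - eta * gy, radius)     # update step
        acc += np.linalg.norm(gy) ** 2
        avg += eta * y                   # step-size-weighted averaging
        weight += eta
    return avg / weight
```

As a toy usage example (again an assumption for illustration), the bilinear saddle-point problem min_u max_v uv over the unit ball corresponds to the monotone operator F(u, v) = (v, -u); the extragradient step is what lets the iterates converge to the solution (0, 0), where plain gradient descent-ascent would cycle:

```python
# Bilinear saddle point: operator F(u, v) = (v, -u), solution (0, 0).
F = lambda z: np.array([z[1], -z[0]])
z_star = adaptive_mirror_prox(F, np.array([0.7, -0.4]), T=2000)
```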

Cite

Text

Bach and Levy. "A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise." Conference on Learning Theory, 2019.

Markdown

[Bach and Levy. "A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise." Conference on Learning Theory, 2019.](https://mlanthology.org/colt/2019/bach2019colt-universal/)

BibTeX

@inproceedings{bach2019colt-universal,
  title     = {{A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise}},
  author    = {Bach, Francis and Levy, Kfir Y.},
  booktitle = {Conference on Learning Theory},
  year      = {2019},
  pages     = {164--194},
  volume    = {99},
  url       = {https://mlanthology.org/colt/2019/bach2019colt-universal/}
}