Halpern Iteration for Near-Optimal and Parameter-Free Monotone Inclusion and Strong Solutions to Variational Inequalities

Abstract

We leverage the connections between nonexpansive maps, monotone Lipschitz operators, and proximal mappings to obtain near-optimal (i.e., optimal up to poly-log factors in terms of iteration complexity) and parameter-free methods for solving monotone inclusion problems. These results immediately translate into near-optimal guarantees for approximating strong solutions to variational inequality problems, approximating convex-concave min-max optimization problems, and minimizing the norm of the gradient in min-max optimization problems. Our analysis is based on a novel and simple potential-based proof of convergence of Halpern iteration, a classical method for finding fixed points of nonexpansive maps. Additionally, we provide a series of algorithmic reductions that highlight connections between different problem classes and lead to lower bounds that certify near-optimality of the studied methods.
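The Halpern iteration mentioned in the abstract anchors each step to the starting point: given a nonexpansive map T, it updates x_{k+1} = λ_k x_0 + (1 − λ_k) T(x_k) with a step sequence λ_k → 0. The sketch below is a minimal illustration, not the paper's method: it assumes the classical schedule λ_k = 1/(k+1) and uses a 90-degree rotation as the nonexpansive map (its unique fixed point is the origin, and plain Picard iteration x_{k+1} = T(x_k) would cycle forever on it, which is exactly the situation the anchoring handles).

```python
import numpy as np

def halpern(T, x0, num_iters=5000):
    """Halpern iteration x_{k+1} = lam_k * x0 + (1 - lam_k) * T(x_k)
    with the classical anchoring schedule lam_k = 1/(k+1)."""
    x = x0.copy()
    for k in range(1, num_iters + 1):
        lam = 1.0 / (k + 1)
        x = lam * x0 + (1.0 - lam) * T(x)
    return x

# A nonexpansive (distance-preserving) map: rotation by 90 degrees.
# Its only fixed point is the origin; Picard iteration on it cycles.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
T = lambda x: R @ x

x_star = halpern(T, np.array([1.0, 1.0]))
residual = np.linalg.norm(x_star - T(x_star))  # fixed-point residual
```

The quantity tracked in analyses of this type is the residual ||x_k − T(x_k)||, which shrinks on the order of 1/k; the potential-based argument in the paper recovers guarantees of this form for monotone inclusion via resolvent/proximal maps.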

Cite

Text

Diakonikolas. "Halpern Iteration for Near-Optimal and Parameter-Free Monotone Inclusion and Strong Solutions to Variational Inequalities." Conference on Learning Theory, 2020.

Markdown

[Diakonikolas. "Halpern Iteration for Near-Optimal and Parameter-Free Monotone Inclusion and Strong Solutions to Variational Inequalities." Conference on Learning Theory, 2020.](https://mlanthology.org/colt/2020/diakonikolas2020colt-halpern/)

BibTeX

@inproceedings{diakonikolas2020colt-halpern,
  title     = {{Halpern Iteration for Near-Optimal and Parameter-Free Monotone Inclusion and Strong Solutions to Variational Inequalities}},
  author    = {Diakonikolas, Jelena},
  booktitle = {Conference on Learning Theory},
  year      = {2020},
  pages     = {1428--1451},
  volume    = {125},
  url       = {https://mlanthology.org/colt/2020/diakonikolas2020colt-halpern/}
}