A Unifying Framework for Variance-Reduced Algorithms for Finding Zeroes of Monotone Operators

Abstract

It is common to encounter large-scale monotone inclusion problems where the operator has a finite-sum structure. We develop a general framework for variance-reduced forward-backward splitting algorithms for this problem. This framework includes a number of existing deterministic and variance-reduced algorithms for function minimization as special cases, and it is also applicable to more general problems such as saddle-point problems and variational inequalities. With a carefully constructed Lyapunov function, we show that the algorithms covered by our framework enjoy a linear convergence rate in expectation under mild assumptions. We further consider Catalyst acceleration and asynchronous implementation to reduce the algorithmic complexity and computation time. We apply our proposed framework to a policy evaluation problem and a strongly monotone two-player game, both of which fall outside the realm of function minimization.
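To make the setting concrete, the following is a minimal sketch (not the paper's algorithm itself) of an SVRG-style variance-reduced forward step for finding the zero of a finite-sum strongly monotone operator. All names, the linear operator family, and the step-size and epoch choices here are illustrative assumptions; the backward (resolvent) step is the identity because no regularizer is included.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative finite-sum monotone operator F(x) = (1/n) * sum_i F_i(x),
# with each F_i(x) = A_i @ x - b_i and A_i symmetric positive definite,
# so F is strongly monotone and has a unique zero.
n, d = 20, 5
As, bs = [], []
for _ in range(n):
    M = rng.standard_normal((d, d))
    As.append(M @ M.T + d * np.eye(d))  # SPD component operator
    bs.append(rng.standard_normal(d))
A_bar = sum(As) / n
b_bar = sum(bs) / n
x_star = np.linalg.solve(A_bar, b_bar)  # the zero of F, for reference

def F_i(i, x):
    return As[i] @ x - bs[i]

def F_full(x):
    return A_bar @ x - b_bar

# SVRG-style variance-reduced forward iteration: at each epoch, evaluate
# the full operator once at an anchor point, then take cheap stochastic
# steps whose estimator is unbiased and has vanishing variance near the zero.
x = np.zeros(d)
eta = 0.01  # assumed step size, small relative to the component Lipschitz constants
for epoch in range(50):
    x_ref = x.copy()
    full = F_full(x_ref)                      # full evaluation at the anchor
    for _ in range(n):
        i = rng.integers(n)
        g = F_i(i, x) - F_i(i, x_ref) + full  # variance-reduced estimate of F(x)
        x = x - eta * g                       # forward step (prox here is identity)

print(np.linalg.norm(x - x_star))  # distance to the zero of F
```

Because each component is strongly monotone here, the iterates contract toward the unique zero, which mirrors the linear convergence in expectation that the paper establishes for its broader operator-splitting framework.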

Cite

Text

Zhang et al. "A Unifying Framework for Variance-Reduced Algorithms for Finding Zeroes of Monotone Operators." Journal of Machine Learning Research, 2022.

Markdown

[Zhang et al. "A Unifying Framework for Variance-Reduced Algorithms for Finding Zeroes of Monotone Operators." Journal of Machine Learning Research, 2022.](https://mlanthology.org/jmlr/2022/zhang2022jmlr-unifying/)

BibTeX

@article{zhang2022jmlr-unifying,
  title     = {{A Unifying Framework for Variance-Reduced Algorithms for Finding Zeroes of Monotone Operators}},
  author    = {Zhang, Xun and Haskell, William B. and Ye, Zhisheng},
  journal   = {Journal of Machine Learning Research},
  year      = {2022},
  pages     = {1--44},
  volume    = {23},
  url       = {https://mlanthology.org/jmlr/2022/zhang2022jmlr-unifying/}
}