Stochastic Proximal Point Methods for Monotone Inclusions Under Expected Similarity

Abstract

Monotone inclusions have a wide range of applications, including minimization, saddle-point, and equilibrium problems. We introduce new stochastic algorithms, with or without variance reduction, to estimate a root of the expectation of possibly set-valued monotone operators, using at every iteration one call to the resolvent of a randomly sampled operator. We also introduce a notion of similarity between the operators, which holds even for discontinuous operators. We leverage it to derive linear convergence results in the strongly monotone setting.
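To make the iteration concrete, here is a minimal toy sketch (not taken from the paper) of a stochastic proximal point step: each operator `A_i` is the gradient of a strongly convex quadratic, so its resolvent has a closed form, and every iteration makes one resolvent call on a uniformly sampled operator. The instance, names, and step-size choice are illustrative assumptions, not the authors' exact method.

```python
import random

# Hypothetical toy instance: A_i(x) = a_i * (x - b_i), the gradient of the
# strongly convex quadratic f_i(x) = (a_i / 2) * (x - b_i)**2.
ops = [(1.0, 0.0), (2.0, 3.0)]  # (a_i, b_i) pairs

def resolvent(gamma, a, b, x):
    """Resolvent of gamma * A_i: solve y + gamma * a * (y - b) = x for y."""
    return (x + gamma * a * b) / (1.0 + gamma * a)

def sppm(ops, x0=0.0, iters=20000, seed=0):
    """Stochastic proximal point: one resolvent call per iteration on a
    uniformly sampled operator, with a decreasing step size (an assumption
    made here for illustration)."""
    rng = random.Random(seed)
    x = x0
    for k in range(iters):
        a, b = rng.choice(ops)
        x = resolvent(1.0 / (k + 1), a, b, x)
    return x

# The root of the expected operator solves sum_i a_i * (x - b_i) = 0,
# i.e. x* = (1*0 + 2*3) / (1 + 2) = 2.0; the iterates approach it.
x = sppm(ops)
```

Since each quadratic is strongly convex, the expected operator is strongly monotone, which is the regime in which the paper derives linear convergence for its variance-reduced variants.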

Cite

Text

Sadiev et al. "Stochastic Proximal Point Methods for Monotone Inclusions Under Expected Similarity." NeurIPS 2024 Workshops: OPT, 2024.

Markdown

[Sadiev et al. "Stochastic Proximal Point Methods for Monotone Inclusions Under Expected Similarity." NeurIPS 2024 Workshops: OPT, 2024.](https://mlanthology.org/neuripsw/2024/sadiev2024neuripsw-stochastic/)

BibTeX

@inproceedings{sadiev2024neuripsw-stochastic,
  title     = {{Stochastic Proximal Point Methods for Monotone Inclusions Under Expected Similarity}},
  author    = {Sadiev, Abdurakhmon and Condat, Laurent and Richtárik, Peter},
  booktitle = {NeurIPS 2024 Workshops: OPT},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/sadiev2024neuripsw-stochastic/}
}