RandProx: Primal-Dual Optimization Algorithms with Randomized Proximal Updates

Abstract

Proximal splitting algorithms are well suited for large-scale nonsmooth optimization problems. We propose a primal-dual algorithm in which the dual update is randomized, with the proximity operator of one of the functions replaced by a stochastic oracle. A nonsmooth variance-reduction technique is implemented so that the algorithm finds an exact minimizer of the general problem. We derive linear convergence results in the presence of strong convexity. Several existing randomized algorithms, like Point-SAGA, are recovered as particular cases. Randomness helps in obtaining faster algorithms; this has long been known for stochastic-gradient-type algorithms, and our work shows that it fully applies in the more general primal-dual setting as well.
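The abstract centers on proximity operators, the building block that RandProx randomizes. As a minimal illustration (not the paper's algorithm), the proximity operator of the scaled L1 norm has the well-known closed form of soft-thresholding:

```python
import numpy as np

def prox_l1(v, step):
    """Proximity operator of step * ||.||_1 (soft-thresholding).

    prox_{step*||.||_1}(v) = argmin_x ||x||_1 + (1/(2*step)) * ||x - v||^2,
    which decouples componentwise into shrinking each entry toward 0.
    """
    return np.sign(v) * np.maximum(np.abs(v) - step, 0.0)

v = np.array([3.0, 0.5, -2.0])
print(prox_l1(v, 1.0))  # entries shrunk toward zero by 1: [2., 0., -1.]
```

In a primal-dual method, an operator like this is evaluated at every iteration on the dual variable; RandProx replaces such an exact evaluation with a cheaper stochastic oracle plus variance reduction.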

Cite

Text

Condat and Richtárik. "RandProx: Primal-Dual Optimization Algorithms with Randomized Proximal Updates." NeurIPS 2022 Workshops: OPT, 2022.

Markdown

[Condat and Richtárik. "RandProx: Primal-Dual Optimization Algorithms with Randomized Proximal Updates." NeurIPS 2022 Workshops: OPT, 2022.](https://mlanthology.org/neuripsw/2022/condat2022neuripsw-randprox/)

BibTeX

@inproceedings{condat2022neuripsw-randprox,
  title     = {{RandProx: Primal-Dual Optimization Algorithms with Randomized Proximal Updates}},
  author    = {Condat, Laurent and Richtárik, Peter},
  booktitle = {NeurIPS 2022 Workshops: OPT},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/condat2022neuripsw-randprox/}
}