Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise

Abstract

In this paper, we propose a unified view of gradient-based algorithms for stochastic convex composite optimization by extending the concept of estimate sequence introduced by Nesterov. More precisely, we interpret a large class of stochastic optimization methods as procedures that iteratively minimize a surrogate of the objective, a class that covers the stochastic gradient descent method and variants of the incremental approaches SAGA, SVRG, and MISO/Finito/SDCA. This point of view has several advantages: (i) we provide a simple generic proof of convergence for all of the aforementioned methods; (ii) we naturally obtain new algorithms with the same guarantees; (iii) we derive generic strategies to make these algorithms robust to stochastic noise, which is useful when data is corrupted by small random perturbations. Finally, we propose a new accelerated stochastic gradient descent algorithm and a new accelerated SVRG algorithm that is robust to stochastic noise.
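
As a minimal illustration of this surrogate viewpoint (our own notation, not reproduced from the paper): for a composite objective F = f + psi, a proximal stochastic gradient step can be read as exactly minimizing a simple quadratic surrogate of F built from an unbiased gradient estimate g_k of \nabla f(x_{k-1}) and a step size \eta_k:

d_k(x) = f(x_{k-1}) + \langle g_k, x - x_{k-1} \rangle + \frac{1}{2\eta_k} \|x - x_{k-1}\|^2 + \psi(x),
\qquad
x_k = \operatorname*{arg\,min}_{x} \; d_k(x) = \operatorname{prox}_{\eta_k \psi}\!\left(x_{k-1} - \eta_k g_k\right).

Estimate-sequence arguments then track how tightly such surrogates d_k approximate F in expectation along the iterates, which is what yields the unified convergence proofs mentioned above.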

Cite

Text

Kulunchakov and Mairal. "Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise." Journal of Machine Learning Research, 2020.

Markdown

[Kulunchakov and Mairal. "Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise." Journal of Machine Learning Research, 2020.](https://mlanthology.org/jmlr/2020/kulunchakov2020jmlr-estimate/)

BibTeX

@article{kulunchakov2020jmlr-estimate,
  title     = {{Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise}},
  author    = {Kulunchakov, Andrei and Mairal, Julien},
  journal   = {Journal of Machine Learning Research},
  year      = {2020},
  pages     = {1--52},
  volume    = {21},
  url       = {https://mlanthology.org/jmlr/2020/kulunchakov2020jmlr-estimate/}
}