Super-Acceleration with Cyclical Step-Sizes

Abstract

We develop a convergence-rate analysis of momentum with cyclical step-sizes. We show that, under an assumption on the spectral gap of Hessians arising in machine learning, cyclical step-sizes are provably faster than constant step-sizes. More precisely, we develop a convergence-rate analysis for quadratic objectives that provides optimal parameters and shows that cyclical learning rates can improve upon traditional lower complexity bounds. We further propose a systematic approach to designing optimal first-order methods for quadratic minimization with a given spectral structure. Finally, we provide a local convergence-rate analysis beyond quadratic minimization for the proposed methods and illustrate our findings through benchmarks on least-squares and logistic-regression problems.
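The setting the abstract describes can be sketched numerically. Below is a minimal illustration of heavy-ball momentum on a quadratic whose Hessian eigenvalues form two well-separated clusters (a spectral gap): one run uses the classical constant-step tuning for the full eigenvalue range, the other cycles between two step-sizes, a large one aimed at the small-eigenvalue cluster and a small one aimed at the large-eigenvalue cluster. The Hessian, cluster locations, and cyclical parameters here are hand-picked for illustration and are not the optimal tuning derived in the paper.

```python
import numpy as np

def heavy_ball(H, x0, step_sizes, momentum, n_iters):
    """Heavy-ball iteration x_{t+1} = x_t - gamma_t * H x_t + m * (x_t - x_{t-1}),
    where the step-size gamma_t cycles through `step_sizes`."""
    x_prev, x = x0.copy(), x0.copy()
    for t in range(n_iters):
        gamma = step_sizes[t % len(step_sizes)]
        x_prev, x = x, x - gamma * (H @ x) + momentum * (x - x_prev)
    return x

# Hypothetical Hessian whose eigenvalues form two clusters ([1, 1.1] and
# [99, 100]) separated by a large spectral gap -- the regime the paper targets.
H = np.diag([1.0, 1.1, 99.0, 100.0])
x0 = np.ones(4)

# Constant step-size baseline: classical heavy-ball tuning for the whole
# interval [mu, L] = [1, 100], ignoring the gap.
L, mu = 100.0, 1.0
m_const = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2
g_const = (2.0 / (np.sqrt(L) + np.sqrt(mu))) ** 2
x_const = heavy_ball(H, x0, [g_const], m_const, 100)

# Cyclical step-sizes of period two: a large step for the small-eigenvalue
# cluster alternating with a small step for the large-eigenvalue cluster
# (hand-picked values, not the paper's optimal parameters).
x_cycle = heavy_ball(H, x0, [0.9, 1.1 / 99.5], 0.1, 100)

print(np.linalg.norm(x_const), np.linalg.norm(x_cycle))
```

Both runs converge to the minimizer at the origin; with this spectrum, the cyclical schedule can exploit the gap that the constant-step tuning, which only sees the extreme eigenvalues, cannot.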

Cite

Text

Goujaud et al. "Super-Acceleration with Cyclical Step-Sizes." Artificial Intelligence and Statistics, 2022.

Markdown

[Goujaud et al. "Super-Acceleration with Cyclical Step-Sizes." Artificial Intelligence and Statistics, 2022.](https://mlanthology.org/aistats/2022/goujaud2022aistats-superacceleration/)

BibTeX

@inproceedings{goujaud2022aistats-superacceleration,
  title     = {{Super-Acceleration with Cyclical Step-Sizes}},
  author    = {Goujaud, Baptiste and Scieur, Damien and Dieuleveut, Aymeric and Taylor, Adrien B. and Pedregosa, Fabian},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2022},
  pages     = {3028--3065},
  volume    = {151},
  url       = {https://mlanthology.org/aistats/2022/goujaud2022aistats-superacceleration/}
}