Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees

Abstract

We present a novel way of generating Lyapunov functions for proving linear convergence rates of first-order optimization methods. Our approach provably obtains the fastest linear convergence rate that can be verified by a quadratic Lyapunov function (with given states), and only relies on solving a small semidefinite program. Our approach combines the advantages of performance estimation problems (PEP, due to Drori and Teboulle (2014)) and integral quadratic constraints (IQC, due to Lessard et al. (2016)), and relies on convex interpolation (due to Taylor et al. (2017c;b)).
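To give a flavor of the approach, here is a minimal numerical sketch (our own construction for illustration, not the paper's code): for gradient descent x+ = x - a*grad f(x) on an L-smooth, mu-strongly convex function, the quadratic Lyapunov function V(x) = ||x - x*||^2 certifies a contraction rate rho whenever a small linear matrix inequality, built from one interpolation (sector) inequality combined via an S-procedure multiplier, is feasible. The specific values of mu, L, the step size, and the multiplier grid below are illustrative assumptions.

```python
import numpy as np

# Illustrative problem class and method (assumed values, not from the paper).
mu, L = 1.0, 10.0     # strong convexity and smoothness constants
a = 1.0 / L           # gradient-descent step size

def certifiable(rho2, lambdas):
    """Return True if some multiplier lam >= 0 makes the 2x2 LMI hold, i.e.
    ||d - a*u||^2 - rho2*||d||^2 + lam*(u - mu*d)'(L*d - u) <= 0 for all (d, u),
    where d = x - x* and u = grad f(x). The lam-term is the standard
    interpolation/sector inequality for L-smooth, mu-strongly convex f."""
    for lam in lambdas:
        M = np.array([[1.0 - rho2 - lam * mu * L, -a + lam * (mu + L) / 2.0],
                      [-a + lam * (mu + L) / 2.0,  a * a - lam]])
        if np.linalg.eigvalsh(M)[-1] <= 1e-9:  # M negative semidefinite
            return True
    return False

# Bisection on rho^2: smallest rate certified by this quadratic Lyapunov
# function (a coarse grid on the multiplier stands in for an SDP solver).
lambdas = np.linspace(0.0, 0.2, 2001)
lo, hi = 0.0, 1.0
for _ in range(40):
    mid = (lo + hi) / 2.0
    if certifiable(mid, lambdas):
        hi = mid
    else:
        lo = mid
rho = np.sqrt(hi)
print(rho)  # ~0.9, matching the known tight rate 1 - mu/L for this step size
```

In the paper, such feasibility problems are posed directly as small semidefinite programs whose multipliers come from convex interpolation conditions, which is what makes the resulting rates tight rather than merely sufficient.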

Cite

Text

Taylor et al. "Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees." International Conference on Machine Learning, 2018.

Markdown

[Taylor et al. "Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/taylor2018icml-lyapunov/)

BibTeX

@inproceedings{taylor2018icml-lyapunov,
  title     = {{Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees}},
  author    = {Taylor, Adrien and Van Scoy, Bryan and Lessard, Laurent},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {4897--4906},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/taylor2018icml-lyapunov/}
}