Fast Gradient-Based Methods with Exponential Rate: A Hybrid Control Framework

Abstract

Ordinary differential equations, and more generally the dynamical-systems viewpoint, have seen a resurgence of interest in the development of fast optimization methods, mainly thanks to the availability of well-established analysis tools. In this study, we pursue a similar objective and propose a class of hybrid control systems that adopts a second-order differential equation as its continuous flow. A distinctive feature of the proposed differential equation, in comparison with the existing literature, is a state-dependent, time-invariant damping term that acts as a feedback control input. Given a user-defined scalar $\alpha$, it is shown that the proposed control input steers the state trajectories to the global optimizer of a desired objective function with a guaranteed convergence rate of $\mathcal{O}(e^{-\alpha t})$. Our framework requires that the objective function satisfy the so-called Polyak–Łojasiewicz inequality. Furthermore, a discretization method is introduced such that the resulting discrete dynamical system possesses an exponential rate of convergence.
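
For concreteness, the Polyak–Łojasiewicz (PL) inequality assumed above is the standard condition that, for some $\mu > 0$ and global minimum value $f^*$,

$$\frac{1}{2}\,\lVert \nabla f(x) \rVert^2 \;\ge\; \mu \bigl( f(x) - f^* \bigr) \quad \text{for all } x.$$

As a sketch of the kind of dynamics the abstract describes (an illustrative form, not necessarily the paper's exact equation), a second-order flow with a state-dependent, time-invariant damping term $d(x, \dot{x})$ acting as the feedback control input can be written as

$$\ddot{x}(t) + d\bigl(x(t), \dot{x}(t)\bigr)\,\dot{x}(t) + \nabla f\bigl(x(t)\bigr) = 0.$$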

Cite

Text

Kolarijani et al. "Fast Gradient-Based Methods with Exponential Rate: A Hybrid Control Framework." International Conference on Machine Learning, 2018.

Markdown

[Kolarijani et al. "Fast Gradient-Based Methods with Exponential Rate: A Hybrid Control Framework." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/kolarijani2018icml-fast/)

BibTeX

@inproceedings{kolarijani2018icml-fast,
  title     = {{Fast Gradient-Based Methods with Exponential Rate: A Hybrid Control Framework}},
  author    = {Kolarijani, Arman Sharifi and Esfahani, Peyman Mohajerin and Keviczky, Tam{\'a}s},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {2728--2736},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/kolarijani2018icml-fast/}
}