The Physical Systems Behind Optimization Algorithms

Abstract

We use differential-equation-based approaches to provide *physics* insights into the dynamics of popular optimization algorithms in machine learning. In particular, we study gradient descent, proximal gradient descent, coordinate gradient descent, proximal coordinate gradient descent, and Newton's method, as well as their Nesterov-accelerated variants, in a unified framework motivated by a natural connection between optimization algorithms and physical systems. Our analysis applies to more general algorithms and optimization problems *beyond* convexity and strong convexity, e.g., the Polyak-Łojasiewicz and error bound conditions (possibly nonconvex).
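To make the connection concrete, here is a minimal sketch (not the paper's code) of two standard correspondences it builds on: gradient descent as the forward-Euler discretization of the gradient-flow ODE dx/dt = -∇f(x), and heavy-ball momentum as a discretized damped oscillator ẍ + γẋ + ∇f(x) = 0. The quadratic objective, step sizes, and damping below are illustrative choices, not values from the paper.

```python
import numpy as np

def grad_f(x, A, b):
    """Gradient of f(x) = 0.5 * x^T A x - b^T x (strongly convex when A > 0)."""
    return A @ x - b

def gradient_descent(x0, A, b, eta=0.1, steps=500):
    """Forward-Euler steps x_{k+1} = x_k - eta * grad f(x_k),
    i.e. a discretization of the gradient flow dx/dt = -grad f(x)."""
    x = x0.copy()
    for _ in range(steps):
        x = x - eta * grad_f(x, A, b)
    return x

def heavy_ball(x0, A, b, eta=0.05, beta=0.9, steps=1000):
    """Momentum iteration x_{k+1} = x_k - eta * grad f(x_k) + beta * (x_k - x_{k-1}),
    a discretization of the damped-oscillator ODE x'' + gamma * x' + grad f(x) = 0."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(steps):
        x, x_prev = x - eta * grad_f(x, A, b) + beta * (x - x_prev), x
    return x

A = np.diag([1.0, 4.0])          # positive definite -> unique minimizer A^{-1} b
b = np.array([1.0, 2.0])
x_star = np.linalg.solve(A, b)   # closed-form minimizer [1.0, 0.5]

print(np.allclose(gradient_descent(np.zeros(2), A, b), x_star, atol=1e-6))
print(np.allclose(heavy_ball(np.zeros(2), A, b), x_star, atol=1e-6))
```

Both iterations drive the "particle" to the minimizer; the momentum variant stores the previous iterate, mirroring the velocity state of the physical system.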

Cite

Text

Yang et al. "The Physical Systems Behind Optimization Algorithms." Neural Information Processing Systems, 2018.

Markdown

[Yang et al. "The Physical Systems Behind Optimization Algorithms." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/yang2018neurips-physical/)

BibTeX

@inproceedings{yang2018neurips-physical,
  title     = {{The Physical Systems Behind Optimization Algorithms}},
  author    = {Yang, Lin and Arora, Raman and Braverman, Vladimir and Zhao, Tuo},
  booktitle = {Neural Information Processing Systems},
  year      = {2018},
  pages     = {4372--4381},
  url       = {https://mlanthology.org/neurips/2018/yang2018neurips-physical/}
}