Acceleration via Symplectic Discretization of High-Resolution Differential Equations

Abstract

We study first-order optimization algorithms obtained by discretizing ordinary differential equations (ODEs) corresponding to Nesterov’s accelerated gradient methods (NAGs) and Polyak’s heavy-ball method. We consider three discretization schemes: symplectic Euler (S), explicit Euler (E), and implicit Euler (I). We show that the optimization algorithm generated by applying the symplectic scheme to a high-resolution ODE proposed by Shi et al. [2018] achieves the accelerated rate for minimizing both strongly convex and convex functions. By contrast, the resulting algorithm either fails to achieve acceleration (when the ODE is low-resolution or the scheme is explicit) or is impractical (when the scheme is implicit).
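The symplectic (semi-implicit) Euler scheme contrasted above updates the momentum first and then uses the *new* momentum to update the position, whereas explicit Euler advances both variables from the old state. A minimal sketch of this difference on the low-resolution heavy-ball ODE ẍ + 2√μ ẋ + ∇f(x) = 0 for an ill-conditioned quadratic — the matrix `A`, step size `h`, and iteration count are illustrative assumptions, not the paper's exact high-resolution discretization:

```python
import numpy as np

A = np.diag([1.0, 100.0])  # f(x) = 0.5 * x^T A x, ill-conditioned, mu = 1
mu = 1.0
h = 0.05  # step size (assumed; the paper ties the step size to problem constants)

def grad(x):
    return A @ x

def symplectic_step(x, v):
    # symplectic (semi-implicit) Euler: momentum first, position uses NEW momentum
    v_new = v - h * (2 * np.sqrt(mu) * v + grad(x))
    x_new = x + h * v_new
    return x_new, v_new

def explicit_step(x, v):
    # explicit Euler: both updates use the OLD state
    v_new = v - h * (2 * np.sqrt(mu) * v + grad(x))
    x_new = x + h * v
    return x_new, v_new

x_s, v_s = np.array([1.0, 1.0]), np.zeros(2)
x_e, v_e = np.array([1.0, 1.0]), np.zeros(2)
for _ in range(200):
    x_s, v_s = symplectic_step(x_s, v_s)
    x_e, v_e = explicit_step(x_e, v_e)
```

At this step size the explicit scheme diverges along the stiff coordinate while the symplectic scheme contracts to the minimizer, illustrating the stability gap that separates the two discretizations.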

Cite

Text

Shi et al. "Acceleration via Symplectic Discretization of High-Resolution Differential Equations." Neural Information Processing Systems, 2019.

Markdown

[Shi et al. "Acceleration via Symplectic Discretization of High-Resolution Differential Equations." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/shi2019neurips-acceleration/)

BibTeX

@inproceedings{shi2019neurips-acceleration,
  title     = {{Acceleration via Symplectic Discretization of High-Resolution Differential Equations}},
  author    = {Shi, Bin and Du, Simon S and Su, Weijie and Jordan, Michael I},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {5744--5752},
  url       = {https://mlanthology.org/neurips/2019/shi2019neurips-acceleration/}
}