Accelerated Mirror Descent in Continuous and Discrete Time

Abstract

We study accelerated mirror descent dynamics in continuous and discrete time. Combining the original continuous-time motivation of mirror descent with a recent ODE interpretation of Nesterov's accelerated method, we propose a family of continuous-time descent dynamics for convex functions with Lipschitz gradients, such that the solution trajectories are guaranteed to converge to the optimum at an $O(1/t^2)$ rate. We then show that a large family of first-order accelerated methods can be obtained as a discretization of the ODE, and that these methods converge at an $O(1/k^2)$ rate. This connection between accelerated mirror descent and the ODE provides an intuitive approach to the design and analysis of accelerated first-order algorithms.
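
For intuition, the ODE interpretation of Nesterov's method referenced in the abstract is $\ddot{X} + \frac{3}{t}\dot{X} + \nabla f(X) = 0$ (Su, Boyd, and Candès, 2014); the dynamics studied here instead couple a primal trajectory with a dual (mirror) trajectory. The sketch below is a minimal discretization in the special case of the Euclidean mirror map $\psi(x) = \frac{1}{2}\|x\|^2$, under which the mirror step reduces to a gradient step on the dual variable and a Nesterov-style method is recovered. The function name, step-size rule, and weight sequence are illustrative assumptions, not the paper's verbatim scheme.

import numpy as np

def amd_euclidean(grad_f, x0, step, num_iters, r=3.0):
    # Hypothetical sketch: accelerated mirror descent with the Euclidean
    # mirror map, so the mirror (dual) update is a plain gradient step.
    x_tilde = np.array(x0, dtype=float)  # iterate from the gradient step
    z = np.array(x0, dtype=float)        # dual-averaging (mirror) iterate
    for k in range(num_iters):
        lam = r / (r + k)                    # coupling weight, decays like r/k
        x = lam * z + (1.0 - lam) * x_tilde  # average the two iterates
        g = grad_f(x)
        z = z - step * (k + 1) / r * g       # mirror step with growing weight
        x_tilde = x - step * g               # gradient step
    return x_tilde

# Example: least squares, f(x) = 0.5 * ||Ax - b||^2.
A = np.random.randn(20, 10)
b = np.random.randn(20)
grad_f = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)  # Lipschitz constant of grad_f
x_hat = amd_euclidean(grad_f, np.zeros(10), step=1.0 / L, num_iters=500)

Swapping the Euclidean map for a non-Euclidean one (for example, negative entropy on the simplex) changes only the z-update, which is one way to read the "large family" of accelerated methods mentioned above.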

Cite

Text

Krichene et al. "Accelerated Mirror Descent in Continuous and Discrete Time." Neural Information Processing Systems, 2015.

Markdown

[Krichene et al. "Accelerated Mirror Descent in Continuous and Discrete Time." Neural Information Processing Systems, 2015.](https://mlanthology.org/neurips/2015/krichene2015neurips-accelerated/)

BibTeX

@inproceedings{krichene2015neurips-accelerated,
  title     = {{Accelerated Mirror Descent in Continuous and Discrete Time}},
  author    = {Krichene, Walid and Bayen, Alexandre and Bartlett, Peter L.},
  booktitle = {Neural Information Processing Systems},
  year      = {2015},
  pages     = {2845--2853},
  url       = {https://mlanthology.org/neurips/2015/krichene2015neurips-accelerated/}
}