From Nesterov’s Estimate Sequence to Riemannian Acceleration

Abstract

We propose the first global accelerated gradient method for Riemannian manifolds. Toward establishing our results, we revisit Nesterov’s estimate sequence technique and develop a conceptually simple alternative from first principles. We then extend our analysis to Riemannian acceleration, localizing the key difficulty into “metric distortion.” We control this distortion via a novel geometric inequality, which enables us to formulate and analyze global Riemannian acceleration.
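The Euclidean starting point of the paper, Nesterov's accelerated gradient method, can be illustrated with a minimal sketch. This is a standard constant-momentum variant for an L-smooth, mu-strongly convex objective, not the paper's Riemannian algorithm; the function names and parameters here are illustrative assumptions.

```python
import numpy as np

def nesterov_agd(grad, x0, L, mu, num_iters=400):
    """Nesterov's accelerated gradient method (constant-momentum variant)
    for an L-smooth, mu-strongly convex objective.

    grad : callable returning the gradient at a point
    L, mu: smoothness and strong-convexity constants
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    kappa = L / mu                                   # condition number
    beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum coefficient
    for _ in range(num_iters):
        x_next = y - grad(y) / L                     # gradient step at extrapolated point
        y = x_next + beta * (x_next - x)             # momentum extrapolation
        x = x_next
    return x
```

For example, on the ill-conditioned quadratic f(x) = x^T diag(1, 100) x / 2, the iterates contract at roughly the accelerated rate (1 - 1/sqrt(kappa)) per step rather than the (1 - 1/kappa) rate of plain gradient descent. The paper's contribution is extending this style of analysis to Riemannian manifolds, where metric distortion must additionally be controlled.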

Cite

Text

Ahn and Sra. "From Nesterov’s Estimate Sequence to Riemannian Acceleration." Conference on Learning Theory, 2020.

Markdown

[Ahn and Sra. "From Nesterov’s Estimate Sequence to Riemannian Acceleration." Conference on Learning Theory, 2020.](https://mlanthology.org/colt/2020/ahn2020colt-nesterovs/)

BibTeX

@inproceedings{ahn2020colt-nesterovs,
  title     = {{From Nesterov’s Estimate Sequence to Riemannian Acceleration}},
  author    = {Ahn, Kwangjun and Sra, Suvrit},
  booktitle = {Conference on Learning Theory},
  year      = {2020},
  pages     = {84--118},
  volume    = {125},
  url       = {https://mlanthology.org/colt/2020/ahn2020colt-nesterovs/}
}