An Estimate Sequence for Geodesically Convex Optimization
Abstract
We propose a Riemannian version of Nesterov's Accelerated Gradient algorithm (RAGD), and show that for *geodesically* smooth and strongly convex problems, within a neighborhood of the minimizer whose radius depends on the condition number as well as the sectional curvature of the manifold, RAGD converges to the minimizer with acceleration. Unlike the algorithm of Liu et al. (2017), which requires the exact solution of a nonlinear equation that may itself be intractable, our algorithm is constructive and computationally tractable. Our proof exploits a new estimate sequence and a novel bound on the nonlinear metric distortion, both of which may be of independent interest.
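For intuition only, below is a minimal Python sketch of a Nesterov-style accelerated gradient iteration written in terms of exponential and logarithm maps, instantiated on the unit sphere with the geodesically convex objective f(x) = ½ d(x, p)². The coupling coefficients `eta` and `alpha` and the exact form of the momentum update are illustrative assumptions; this is not the paper's RAGD scheme and carries none of its convergence guarantees.

```python
import numpy as np

# --- Unit-sphere geometry (a concrete manifold for illustration) ---
def exp_map(x, v):
    """Exponential map on the unit sphere: move from x along tangent vector v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def log_map(x, y):
    """Logarithm map on the unit sphere: tangent vector at x pointing toward y."""
    u = y - np.dot(x, y) * x              # project y onto the tangent space at x
    nu = np.linalg.norm(u)
    if nu < 1e-12:
        return np.zeros_like(x)
    theta = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    return theta * u / nu

# Geodesically convex objective near p: f(x) = 0.5 * d(x, p)^2,
# whose Riemannian gradient is -log_x(p).
p = np.array([0.0, 0.0, 1.0])
grad_f = lambda x: -log_map(x, p)

# --- Schematic accelerated iteration (constant coefficients, hypothetical) ---
x = np.array([1.0, 0.0, 0.0])             # starting point on the sphere
v = x.copy()                              # auxiliary "momentum" point
eta, alpha = 0.5, 0.5                     # step size and coupling (illustrative)

for _ in range(50):
    y = exp_map(x, alpha * log_map(x, v))             # blend current and auxiliary points
    g = grad_f(y)
    x = exp_map(y, -eta * g)                          # gradient step from the blended point
    v = exp_map(y, log_map(y, v) - eta * g / alpha)   # pull the auxiliary point along

print("distance to minimizer:", np.linalg.norm(log_map(x, p)))
```

The only Riemannian ingredients are the exponential and logarithm maps, which replace the vector additions in Euclidean Nesterov momentum; on a general manifold these two maps are all that would need to change.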
Cite
Text
Zhang and Sra. "An Estimate Sequence for Geodesically Convex Optimization." Annual Conference on Computational Learning Theory, 2018.

Markdown
[Zhang and Sra. "An Estimate Sequence for Geodesically Convex Optimization." Annual Conference on Computational Learning Theory, 2018.](https://mlanthology.org/colt/2018/zhang2018colt-estimate/)

BibTeX
@inproceedings{zhang2018colt-estimate,
title = {{An Estimate Sequence for Geodesically Convex Optimization}},
author = {Zhang, Hongyi and Sra, Suvrit},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2018},
pages = {1703--1723},
url = {https://mlanthology.org/colt/2018/zhang2018colt-estimate/}
}