First-Order Methods for Geodesically Convex Optimization
Abstract
Geodesic convexity generalizes the notion of (vector space) convexity to nonlinear metric spaces. Unlike convex optimization, however, geodesically convex (g-convex) optimization is much less developed. In this paper we contribute to the understanding of g-convex optimization by developing iteration complexity analysis for several first-order algorithms on Hadamard manifolds. Specifically, we prove upper bounds on the global complexity of deterministic and stochastic (sub)gradient methods for optimizing smooth and nonsmooth g-convex functions, both with and without strong g-convexity. Our analysis also reveals how the manifold geometry, especially sectional curvature, impacts convergence rates. To the best of our knowledge, our work is the first to provide global complexity analysis for first-order algorithms for general g-convex optimization.
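To illustrate the kind of algorithm the paper analyzes, the sketch below runs Riemannian gradient descent on a toy Hadamard manifold: the positive reals with metric ds^2 = dx^2 / x^2, on which f(x) = (log x)^2 is geodesically convex with minimizer x* = 1. The manifold, objective, and step size are illustrative choices for this sketch, not examples taken from the paper.

```python
import math

# Toy Hadamard manifold: R_{>0} with metric ds^2 = dx^2 / x^2.
# Under this metric f(x) = (log x)^2 is geodesically convex, minimized at x* = 1.

def f(x):
    return math.log(x) ** 2

def riemannian_grad(x):
    # Euclidean gradient f'(x) = 2 log(x) / x; the inverse metric rescales it by x^2.
    return x ** 2 * (2.0 * math.log(x) / x)  # = 2 x log(x)

def exp_map(x, v):
    # Exponential map of this metric at x in direction v: Exp_x(v) = x * exp(v / x).
    return x * math.exp(v / x)

def rgd(x0, step=0.25, iters=30):
    # Riemannian gradient descent: move along the geodesic in the
    # negative-gradient direction via the exponential map.
    x = x0
    for _ in range(iters):
        x = exp_map(x, -step * riemannian_grad(x))
    return x

x = rgd(100.0)
print(x, f(x))  # x approaches the minimizer 1.0
```

Note that the update here reduces to x_{k+1} = x_k^{1 - 2*step}, so with step = 0.25 each iteration takes a square root; convergence to x* = 1 is geometric, matching the flavor of the rates the paper establishes for strongly g-convex objectives.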
Cite
Text
Zhang and Sra. "First-Order Methods for Geodesically Convex Optimization." Annual Conference on Computational Learning Theory, 2016.
Markdown
[Zhang and Sra. "First-Order Methods for Geodesically Convex Optimization." Annual Conference on Computational Learning Theory, 2016.](https://mlanthology.org/colt/2016/zhang2016colt-first/)
BibTeX
@inproceedings{zhang2016colt-first,
title = {{First-Order Methods for Geodesically Convex Optimization}},
author = {Zhang, Hongyi and Sra, Suvrit},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2016},
pages = {1617-1638},
url = {https://mlanthology.org/colt/2016/zhang2016colt-first/}
}