Sufficient Conditions for Non-Asymptotic Convergence of Riemannian Optimization Methods
Abstract
Motivated by energy-based analyses for descent methods in the Euclidean setting, we investigate a generalisation of such analyses for descent methods over Riemannian manifolds. In doing so, we find that it is possible to derive curvature-free guarantees for such descent methods. This also enables us to give the first known guarantees for a Riemannian cubic-regularised Newton algorithm over g-convex functions, which extends the guarantees by Agarwal et al. [2021] for an adaptive Riemannian cubic-regularised Newton algorithm over general non-convex functions. This analysis motivates us to study acceleration of Riemannian gradient descent in the g-convex setting, where we improve on an existing result by Alimisis et al. [2021], albeit with a curvature-dependent rate. Finally, extending the analysis by Ahn and Sra [2020], we attempt to provide sufficient conditions for the acceleration of Riemannian descent methods in the strongly geodesically convex setting.
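The paper itself is not reproduced on this page, but as a concrete illustration of the kind of method the abstract discusses, below is a minimal sketch of Riemannian gradient descent on the unit sphere. The objective f(x) = xᵀAx, the test matrix A, the step size, and the iteration count are illustrative assumptions, not taken from the paper; the sketch only shows the generic template (Riemannian gradient via tangent-space projection, step via the exponential map).

```python
import numpy as np

# Illustrative sketch, not the paper's algorithm: Riemannian gradient
# descent on the unit sphere S^{n-1}, minimising f(x) = x^T A x.
# Minimising this objective over the sphere recovers an eigenvector
# for the smallest eigenvalue of A.

def riemannian_grad(A, x):
    euclid_grad = 2.0 * A @ x
    # Project the Euclidean gradient onto the tangent space
    # T_x S^{n-1} = {v : <v, x> = 0}.
    return euclid_grad - (euclid_grad @ x) * x

def exp_map(x, v):
    # Exponential map on the sphere: follow the great circle from x
    # in direction v for arc length ||v||.
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def riemannian_gd(A, x0, step=0.1, iters=200):
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        x = exp_map(x, -step * riemannian_grad(A, x))
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2                    # symmetric test matrix (assumed)
x = riemannian_gd(A, rng.standard_normal(5))
print(x @ A @ x)                     # approaches min eigenvalue of A
```

The cubic-regularised Newton and accelerated variants studied in the paper replace the inner update with more elaborate subproblems, but follow the same pattern of computing a tangent-space update and mapping it back to the manifold.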
Cite

Srinivasan and Wilson. "Sufficient Conditions for Non-Asymptotic Convergence of Riemannian Optimization Methods." NeurIPS 2022 Workshops: OPT, 2022. https://mlanthology.org/neuripsw/2022/srinivasan2022neuripsw-sufficient/

BibTeX
@inproceedings{srinivasan2022neuripsw-sufficient,
  title     = {{Sufficient Conditions for Non-Asymptotic Convergence of Riemannian Optimization Methods}},
  author    = {Srinivasan, Vishwak and Wilson, Ashia Camage},
  booktitle = {NeurIPS 2022 Workshops: OPT},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/srinivasan2022neuripsw-sufficient/}
}