Online Optimization over Riemannian Manifolds
Abstract
Online optimization has witnessed a massive surge of research attention in recent years. In this paper, we propose online gradient descent and online bandit algorithms over Riemannian manifolds, in the full-information and bandit feedback settings respectively, for both geodesically convex and strongly geodesically convex functions. We establish a series of upper bounds on the regret of the proposed algorithms over Hadamard manifolds, as well as a universal lower bound on the achievable regret. Our analysis shows how the time horizon, the dimension, and the sectional curvature bounds affect the regret bounds. When the manifold admits positive sectional curvature, we prove that similar regret bounds can be established by handling non-contractive projection maps. In addition, numerical studies on problems defined over the manifold of symmetric positive definite matrices, hyperbolic spaces, and Grassmann manifolds are provided to validate our theoretical findings, using both synthetic and real-world data.
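To make the full-information setting concrete, below is a minimal sketch of online Riemannian gradient descent on the SPD manifold with the affine-invariant metric, one of the geometries appearing in the numerical studies. The constant step size, the online Karcher-mean losses f_t(X) = d^2(X, A_t), and all helper names here are illustrative assumptions, not the paper's exact algorithm or experimental setup.

```python
import numpy as np

def _sym_apply(X, fun):
    """Apply a scalar function to a symmetric matrix via eigendecomposition."""
    w, Q = np.linalg.eigh((X + X.T) / 2.0)
    return Q @ np.diag(fun(w)) @ Q.T

def spd_exp(X, V):
    """Exponential map Exp_X(V) on SPD matrices, affine-invariant metric."""
    Xh = _sym_apply(X, np.sqrt)
    Xih = _sym_apply(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ _sym_apply(Xih @ V @ Xih, np.exp) @ Xh

def spd_log(X, Y):
    """Logarithm map Log_X(Y), the inverse of Exp_X."""
    Xh = _sym_apply(X, np.sqrt)
    Xih = _sym_apply(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ _sym_apply(Xih @ Y @ Xih, np.log) @ Xh

def r_ogd(grad_seq, X0, eta):
    """Geodesic gradient step each round: X_{t+1} = Exp_{X_t}(-eta * grad f_t(X_t))."""
    X = X0
    for grad_ft in grad_seq:
        X = spd_exp(X, -eta * grad_ft(X))
        yield X

# Illustrative online losses: f_t(X) = d^2(X, A_t) is geodesically convex
# on SPD, with Riemannian gradient -2 * Log_X(A_t).
rng = np.random.default_rng(0)
def random_spd(n=3):
    B = rng.standard_normal((n, n))
    return B @ B.T + n * np.eye(n)

targets = [random_spd() for _ in range(200)]
grads = [lambda X, A=A: -2.0 * spd_log(X, A) for A in targets]
for X in r_ogd(grads, np.eye(3), eta=0.05):
    pass
print(np.round(X, 3))  # final iterate approaches the Karcher mean of the A_t
```

Each round the learner plays the current iterate, observes the Riemannian gradient of the revealed loss, and moves along the geodesic in the negative-gradient direction; since each f_t here is geodesically convex, this is the regime covered by the g-convex regret bounds.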
Cite
Wang, Xi, Zhipeng Tu, Yiguang Hong, Yingyi Wu, and Guodong Shi. "Online Optimization over Riemannian Manifolds." Journal of Machine Learning Research, 24:1-67, 2023. https://mlanthology.org/jmlr/2023/wang2023jmlr-online/
@article{wang2023jmlr-online,
title = {{Online Optimization over Riemannian Manifolds}},
author = {Wang, Xi and Tu, Zhipeng and Hong, Yiguang and Wu, Yingyi and Shi, Guodong},
journal = {Journal of Machine Learning Research},
year = {2023},
pages = {1--67},
volume = {24},
url = {https://mlanthology.org/jmlr/2023/wang2023jmlr-online/}
}