Open Problem: Polynomial Linearly-Convergent Method for G-Convex Optimization?
Abstract
Let $f \colon \mathcal{M} \to \mathbb{R}$ be a Lipschitz and geodesically convex function defined on a $d$-dimensional Riemannian manifold $\mathcal{M}$. Does there exist a first-order deterministic algorithm which (a) uses at most $O(\mathrm{poly}(d) \log(\epsilon^{-1}))$ subgradient queries to find a point with target accuracy $\epsilon$, and (b) requires only $O(\mathrm{poly}(d))$ arithmetic operations per query? In convex optimization, the classical ellipsoid method achieves this. After detailing related work, we provide an ellipsoid-like algorithm with query complexity $O(d^2 \log^2(\epsilon^{-1}))$ and per-query complexity $O(d^2)$ for the limited case where $\mathcal{M}$ has constant curvature (hemisphere or hyperbolic space). We then detail possible approaches and corresponding obstacles for designing an ellipsoid-like method for general Riemannian manifolds.
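For concreteness, below is a minimal sketch of the classical (central-cut) ellipsoid method the abstract refers to, in the Euclidean setting: each iteration costs $O(d^2)$ arithmetic operations, and the ellipsoid's volume shrinks by a factor of at least $e^{-1/(2(d+1))}$ per iteration, which is what yields the $\mathrm{poly}(d)\log(\epsilon^{-1})$ query complexity. This is an illustrative sketch, not code from the paper; the function name `ellipsoid_method` and the stopping rule are our own choices.

```python
import numpy as np

def ellipsoid_method(subgrad, c0, P0, n_iters=500):
    """Classical central-cut ellipsoid method for minimizing a convex f on R^d.

    subgrad(x): returns a subgradient of f at x.
    The initial ellipsoid E0 = {x : (x - c0)^T P0^{-1} (x - c0) <= 1}
    is assumed to contain a minimizer of f. Returns the last center; in
    practice one returns the queried center with the smallest f-value seen.
    """
    d = c0.shape[0]
    assert d >= 2, "the central-cut update below needs d >= 2"
    c = np.asarray(c0, dtype=float).copy()
    P = np.asarray(P0, dtype=float).copy()
    for _ in range(n_iters):
        g = subgrad(c)
        gPg = g @ P @ g
        if gPg <= 0:  # zero subgradient: the current center is a minimizer
            break
        gt = g / np.sqrt(gPg)  # normalize g in the metric induced by P
        Pg = P @ gt
        # Replace E by the minimum-volume ellipsoid containing the
        # half-ellipsoid {x in E : g^T (x - c) <= 0}; its volume is at
        # most exp(-1/(2(d+1))) times that of E.
        c = c - Pg / (d + 1)
        P = (d * d / (d * d - 1.0)) * (P - (2.0 / (d + 1)) * np.outer(Pg, Pg))
    return c

# Example: minimize f(x) = ||x||_1, starting from the ball of radius 10.
# x = ellipsoid_method(lambda x: np.sign(x), c0=np.ones(5), P0=100 * np.eye(5))
```

The open problem asks whether an analogous update, with the same query and per-query complexity guarantees, exists when the ellipsoids must live on a Riemannian manifold rather than in $\mathbb{R}^d$.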
Cite

Text:
Criscitiello et al. "Open Problem: Polynomial Linearly-Convergent Method for G-Convex Optimization?" Conference on Learning Theory, 2023.

Markdown:
[Criscitiello et al. "Open Problem: Polynomial Linearly-Convergent Method for G-Convex Optimization?" Conference on Learning Theory, 2023.](https://mlanthology.org/colt/2023/criscitiello2023colt-open/)

BibTeX:
@inproceedings{criscitiello2023colt-open,
title = {{Open Problem: Polynomial Linearly-Convergent Method for G-Convex Optimization?}},
author = {Criscitiello, Christopher and Martínez-Rubio, David and Boumal, Nicolas},
booktitle = {Conference on Learning Theory},
year = {2023},
pages = {5950--5956},
volume = {195},
url = {https://mlanthology.org/colt/2023/criscitiello2023colt-open/}
}