Implicit Riemannian Optimism with Applications to Min-Max Problems

Abstract

We introduce a Riemannian optimistic online learning algorithm for Hadamard manifolds based on inexact implicit updates. Unlike prior work, our method can handle in-manifold constraints and matches the best known regret bounds in the Euclidean setting, with no dependence on geometric constants such as the minimum curvature. Building on this, we develop algorithms for g-convex, g-concave smooth min-max problems on Hadamard manifolds. Notably, for the first time, one of our methods nearly matches the lower bound on gradient oracle complexity known for Euclidean problems.
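
For context, a generic implicit (proximal-style) online update on a Hadamard manifold is sketched below; this is background illustration only, not the exact scheme from the paper, and the symbols ($\ell_t$, $\eta$, $\mathcal{X}$, $d$) are illustrative notation rather than the authors'. With geodesic distance $d$, step size $\eta > 0$, loss $\ell_t$ at round $t$, and a g-convex feasible set $\mathcal{X}$, the update solves

$$ x_{t+1} \in \operatorname*{arg\,min}_{x \in \mathcal{X}} \; \ell_t(x) + \frac{1}{2\eta}\, d(x, x_t)^2. $$

An optimistic variant additionally incorporates a hint (prediction) of the upcoming loss, and, as the abstract notes, the paper's method solves such subproblems only inexactly.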

Cite

Text

Roux et al. "Implicit Riemannian Optimism with Applications to Min-Max Problems." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Roux et al. "Implicit Riemannian Optimism with Applications to Min-Max Problems." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/roux2025icml-implicit/)

BibTeX

@inproceedings{roux2025icml-implicit,
  title     = {{Implicit Riemannian Optimism with Applications to Min-Max Problems}},
  author    = {Roux, Christophe and Martínez-Rubio, David and Pokutta, Sebastian},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {52139--52172},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/roux2025icml-implicit/}
}