Riemannian Stochastic Recursive Gradient Algorithm

Abstract

Stochastic variance reduction algorithms have recently become popular for minimizing the average of a large but finite number of loss functions on a Riemannian manifold. The present paper proposes a Riemannian stochastic recursive gradient algorithm (R-SRG), which does not require the inverse of a retraction between two distant iterates on the manifold. Convergence analyses of R-SRG are performed on both retraction-convex and non-convex functions under computationally efficient retraction and vector transport operations. The key challenge is the analysis of the influence of vector transport along the retraction curve. Numerical evaluations reveal that R-SRG competes well with state-of-the-art Riemannian batch and stochastic gradient algorithms.
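
To make the abstract's recursive gradient estimator concrete, here is a minimal Python sketch of one outer epoch on the unit sphere, using the projection retraction and projection-based vector transport as computationally efficient stand-ins. All names (retract, transport, egrad_to_rgrad, r_srg_epoch) and the toy setup are illustrative assumptions, not code from the paper; the point is that the estimator only transports vectors forward along the retraction curve and never needs an inverse retraction.

import numpy as np

def retract(x, v):
    # Projection retraction on the sphere: step along v, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def transport(x_new, v):
    # Projection-based vector transport: project v onto the tangent space at x_new.
    return v - np.dot(x_new, v) * x_new

def egrad_to_rgrad(x, g):
    # Riemannian gradient = tangent-space projection of the Euclidean gradient.
    return g - np.dot(x, g) * x

def r_srg_epoch(x0, egrads, eta=0.1, inner_iters=50, rng=None):
    # One outer epoch of a recursive-gradient (SARAH-style) scheme on the sphere.
    # egrads: list of callables, egrads[i](x) = Euclidean gradient of loss i at x.
    rng = rng or np.random.default_rng()
    n = len(egrads)
    # Full Riemannian gradient at the snapshot point.
    v = egrad_to_rgrad(x0, sum(g(x0) for g in egrads) / n)
    x_prev, x = x0, retract(x0, -eta * v)
    for _ in range(inner_iters):
        i = rng.integers(n)
        g_new = egrad_to_rgrad(x, egrads[i](x))
        g_old = egrad_to_rgrad(x_prev, egrads[i](x_prev))
        # Recursive estimator: new stochastic gradient, minus the transported
        # old one, plus the transported previous estimator.
        v = g_new - transport(x, g_old) + transport(x, v)
        x_prev, x = x, retract(x, -eta * v)
    return x

As one possible usage, setting egrads[i](x) = -2 * A[i] @ x turns this sketch into stochastic leading-eigenvector estimation for the average of symmetric matrices A[i] (Rayleigh quotient maximization on the sphere), a standard test problem for Riemannian stochastic methods.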

Cite

Text

Kasai et al. "Riemannian Stochastic Recursive Gradient Algorithm." International Conference on Machine Learning, 2018.

Markdown

[Kasai et al. "Riemannian Stochastic Recursive Gradient Algorithm." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/kasai2018icml-riemannian/)

BibTeX

@inproceedings{kasai2018icml-riemannian,
  title     = {{Riemannian Stochastic Recursive Gradient Algorithm}},
  author    = {Kasai, Hiroyuki and Sato, Hiroyuki and Mishra, Bamdev},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {2516--2524},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/kasai2018icml-riemannian/}
}