Riemannian Stochastic Quasi-Newton Algorithm with Variance Reduction and Its Convergence Analysis

Abstract

Stochastic variance reduction algorithms have recently become popular for minimizing the average of a large but finite number of loss functions. The present paper proposes a Riemannian stochastic quasi-Newton algorithm with variance reduction (R-SQN-VR). The key challenges of averaging, adding, and subtracting multiple gradients are addressed with the notions of retraction and vector transport. We present convergence analyses of R-SQN-VR on both non-convex and retraction-convex functions under retraction and vector transport operators. The proposed algorithm is evaluated on Karcher mean computation on the symmetric positive-definite manifold and on low-rank matrix completion on the Grassmann manifold. In all cases, the proposed algorithm outperforms state-of-the-art Riemannian batch and stochastic gradient algorithms.
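To make the variance-reduction idea concrete, below is a minimal Euclidean sketch of the SVRG-style gradient estimator that R-SQN-VR builds on, applied to a toy least-squares problem. This is an illustrative assumption, not the paper's algorithm: in the Riemannian setting, the gradient subtraction would require vector transport and the additive update would be replaced by a retraction, and the quasi-Newton (Hessian approximation) component is omitted entirely.

```python
import numpy as np

# Toy problem: f(w) = (1/n) * sum_i 0.5 * (a_i . w - b_i)^2
# (illustrative setup, not the paper's experiments)
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(w, i):
    """Gradient of the i-th component loss 0.5 * (a_i . w - b_i)^2."""
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    """Full (batch) gradient of f at w."""
    return A.T @ (A @ w - b) / n

def loss(w):
    return 0.5 * np.mean((A @ w - b) ** 2)

w = np.zeros(d)
eta = 0.05  # step size (hand-picked for this toy problem)
for epoch in range(10):
    w_snap = w.copy()           # snapshot point kept for the whole epoch
    g_snap = full_grad(w_snap)  # full gradient computed once per epoch
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced estimator: unbiased for full_grad(w), and its
        # variance vanishes as w approaches w_snap. On a manifold, the
        # two gradients live in different tangent spaces, so this
        # subtraction would need vector transport.
        v = grad_i(w, i) - grad_i(w_snap, i) + g_snap
        # On a manifold, this step would be a retraction of -eta * v.
        w = w - eta * v

print(loss(w))
```

The key point is that the full gradient is recomputed only once per epoch at the snapshot, while each inner step uses a single-sample correction, which is what makes the variance of the estimator shrink as the iterates converge.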

Cite

Text

Kasai et al. "Riemannian Stochastic Quasi-Newton Algorithm with Variance Reduction and Its Convergence Analysis." International Conference on Artificial Intelligence and Statistics, 2018.

Markdown

[Kasai et al. "Riemannian Stochastic Quasi-Newton Algorithm with Variance Reduction and Its Convergence Analysis." International Conference on Artificial Intelligence and Statistics, 2018.](https://mlanthology.org/aistats/2018/kasai2018aistats-riemannian/)

BibTeX

@inproceedings{kasai2018aistats-riemannian,
  title     = {{Riemannian Stochastic Quasi-Newton Algorithm with Variance Reduction and Its Convergence Analysis}},
  author    = {Kasai, Hiroyuki and Sato, Hiroyuki and Mishra, Bamdev},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2018},
  pages     = {269--278},
  url       = {https://mlanthology.org/aistats/2018/kasai2018aistats-riemannian/}
}