Noisy Riemannian Gradient Descent for Eigenvalue Computation with Application to Inexact Stochastic Recursive Gradient Algorithm

Abstract

We provide a robust convergence analysis of the Riemannian gradient descent algorithm for computing the leading eigenvector of a real symmetric matrix. Our result characterizes the convergence behavior of the algorithm under noisy updates, where the noise may be generated by a stochastic process or chosen adversarially. Noisy Riemannian gradient descent has a broad range of applications in machine learning and statistics, e.g., streaming principal component analysis and privacy-preserving spectral analysis. In particular, we demonstrate the usefulness of our convergence bound by deriving a new eigengap-dependent sample complexity for the inexact Riemannian stochastic recursive gradient algorithm, which uses mini-batch gradients instead of full gradients in its outer loops. Our robust convergence paradigm strictly improves on the state-of-the-art sample complexity in terms of gap dependence.
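As a rough illustration of the setting analyzed in the paper, the following NumPy sketch runs Riemannian gradient ascent on the Rayleigh quotient over the unit sphere, with an additive perturbation standing in for the stochastic or adversarial noise. The step size, iteration count, and Gaussian noise model are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def noisy_rgd_leading_eigvec(A, eta=0.1, n_iters=500, noise_scale=1e-3, seed=0):
    """Noisy Riemannian gradient ascent on the unit sphere for the
    leading eigenvector of a symmetric matrix A (a sketch only; the
    paper's step sizes and noise model may differ)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(n_iters):
        Ax = A @ x
        # Riemannian gradient of the Rayleigh quotient x^T A x on the sphere:
        # the Euclidean gradient Ax projected onto the tangent space at x.
        grad = Ax - (x @ Ax) * x
        # Additive perturbation standing in for stochastic/adversarial noise.
        noise = noise_scale * rng.standard_normal(n)
        x = x + eta * (grad + noise)
        x /= np.linalg.norm(x)  # retraction back onto the sphere
    return x

# Usage: recover the top eigenvector of a random symmetric matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((50, 50))
A = (M + M.T) / 2
x = noisy_rgd_leading_eigvec(A)
w, V = np.linalg.eigh(A)
print(abs(x @ V[:, -1]))  # near 1 when aligned with the leading eigenvector
```

In the inexact stochastic recursive gradient application, the role of the `noise` term above would be played by the estimation error of the mini-batch recursive gradient estimator relative to the full Riemannian gradient.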

Cite

Text

Chen et al. "Noisy Riemannian Gradient Descent for Eigenvalue Computation with Application to Inexact Stochastic Recursive Gradient Algorithm." Proceedings of The 14th Asian Conference on Machine Learning, 2022.

Markdown

[Chen et al. "Noisy Riemannian Gradient Descent for Eigenvalue Computation with Application to Inexact Stochastic Recursive Gradient Algorithm." Proceedings of The 14th Asian Conference on Machine Learning, 2022.](https://mlanthology.org/acml/2022/chen2022acml-noisy/)

BibTeX

@inproceedings{chen2022acml-noisy,
  title     = {{Noisy Riemannian Gradient Descent for Eigenvalue Computation with Application to Inexact Stochastic Recursive Gradient Algorithm}},
  author    = {Chen, You-Lin and Xu, Zhiqiang and Li, Ping},
  booktitle = {Proceedings of The 14th Asian Conference on Machine Learning},
  year      = {2022},
  pages     = {201--216},
  volume    = {189},
  url       = {https://mlanthology.org/acml/2022/chen2022acml-noisy/}
}