A Fast Algorithm for Matrix Eigen-Decomposition

Abstract

We propose a fast stochastic Riemannian gradient eigensolver for real symmetric matrices, and prove its local, eigengap-dependent, linear convergence. The speedup comes from deploying the variance reduction technique originally developed for strongly convex problems in Euclidean space. In this paper, we generalize this technique to Riemannian manifolds to solve the geodesically non-convex problem of finding a group of top eigenvectors of such a matrix. We first derive the general variance-reduced form of the stochastic Riemannian gradient, giving rise to the stochastic variance reduced Riemannian gradient method (SVRRG). It turns out that a vector transport operation is necessary in addition to the Riemannian gradient and retraction operations. We then specialize SVRRG to the problem at hand, obtaining our SVRRGEIGS algorithm.
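To illustrate the three ingredients the abstract names (stochastic Riemannian gradients, retraction, and vector transport combined in an SVRG-style update), here is a minimal sketch for the simplest case, the top eigenvector on the unit sphere. All function and parameter names are hypothetical, the vector transport is taken to be tangent-space projection, and the retraction is normalization; this is an assumed instantiation for illustration, not the authors' reference implementation of SVRRGEIGS.

```python
import numpy as np

def svrrg_top_eigvec(As, x0, eta=0.1, epochs=20, inner=50, seed=None):
    """Sketch of a stochastic variance-reduced Riemannian gradient ascent
    for the top eigenvector of A = mean(As) over the unit sphere.
    (Hypothetical names/step sizes; illustrative only.)"""
    rng = np.random.default_rng(seed)
    As = np.asarray(As)
    A = As.mean(axis=0)                              # full matrix (for snapshot gradients)
    n = len(As)

    proj = lambda x, u: u - x * (x @ u)              # tangent-space projection at x
    retract = lambda y: y / np.linalg.norm(y)        # retraction: back onto the sphere

    x = retract(np.asarray(x0, dtype=float))
    for _ in range(epochs):
        x_snap = x.copy()
        full_grad = proj(x_snap, A @ x_snap)         # full Riemannian gradient at snapshot
        for _ in range(inner):
            i = rng.integers(n)
            g = proj(x, As[i] @ x)                   # stochastic Riemannian gradient at x
            g_snap = proj(x_snap, As[i] @ x_snap)    # same sample, at the snapshot
            # Variance-reduced direction: snapshot terms live in the tangent
            # space at x_snap, so transport them to T_x (here via projection,
            # a simple choice of vector transport) before combining.
            v = g - proj(x, g_snap - full_grad)
            x = retract(x + eta * v)                 # ascent step, then retract
    return x
```

Note the role of vector transport: without it, `g_snap - full_grad` lies in the tangent space at the snapshot point, so it cannot be soundly added to a gradient in the tangent space at the current iterate; this is exactly the extra operation the abstract says plain SVRG lacks.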

Cite

Text

Xu et al. "A Fast Algorithm for Matrix Eigen-Decomposition." Conference on Uncertainty in Artificial Intelligence, 2017.

Markdown

[Xu et al. "A Fast Algorithm for Matrix Eigen-Decomposition." Conference on Uncertainty in Artificial Intelligence, 2017.](https://mlanthology.org/uai/2017/xu2017uai-fast/)

BibTeX

@inproceedings{xu2017uai-fast,
  title     = {{A Fast Algorithm for Matrix Eigen-Decomposition}},
  author    = {Xu, Zhiqiang and Ke, Yiping and Gao, Xin},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {2017},
  url       = {https://mlanthology.org/uai/2017/xu2017uai-fast/}
}