Matrix Eigen-Decomposition via Doubly Stochastic Riemannian Optimization
Abstract
Matrix eigen-decomposition is a classic and long-standing problem that plays a fundamental role in scientific computing and machine learning. Although several algorithms exist for this inherently non-convex problem, they remain inadequate for today's large-scale data. To address this gap, we propose a Doubly Stochastic Riemannian Gradient EIGenSolver, DSRG-EIGS, where the double stochasticity comes from generalizing stochastic Euclidean gradient ascent and stochastic Euclidean coordinate ascent to Riemannian manifolds. This greatly reduces the per-iteration complexity, allows the algorithm to avoid matrix inversion entirely, and makes it well suited to large-scale applications. We theoretically analyze its convergence properties and empirically validate the method on real-world datasets. Encouraging experimental results demonstrate its advantages over deterministic counterparts.
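To make the "doubly stochastic" idea concrete, below is a minimal illustrative sketch, not the authors' DSRG-EIGS algorithm. It assumes the standard trace-maximization formulation max_X tr(X^T A X) over the Stiefel manifold for a symmetric matrix given as an average of terms A = mean(A_1, ..., A_n). Each iteration samples one data term (stochastic gradient) and one column of X (stochastic coordinate), then applies a QR-based retraction, which requires no matrix inversion. All function and variable names here are hypothetical.

```python
import numpy as np

def doubly_stochastic_eigs(A_terms, k, n_iters=1000, step=0.01, seed=0):
    """Illustrative doubly stochastic Riemannian gradient ascent for the
    top-k eigenvectors of A = mean(A_terms), via trace maximization over
    the Stiefel manifold St(d, k). Hypothetical sketch, not DSRG-EIGS."""
    rng = np.random.default_rng(seed)
    d = A_terms[0].shape[0]
    # Random orthonormal initialization on the Stiefel manifold.
    X, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for _ in range(n_iters):
        A_i = A_terms[rng.integers(len(A_terms))]  # stochastic data term
        j = rng.integers(k)                        # stochastic coordinate (column)
        # Euclidean gradient of tr(X^T A_i X) restricted to column j.
        G = np.zeros((d, k))
        G[:, [j]] = 2.0 * A_i @ X[:, [j]]
        # Riemannian gradient: project onto the tangent space of St(d, k).
        rg = G - X @ ((X.T @ G + G.T @ X) / 2.0)
        # QR retraction keeps the iterate orthonormal without any inversion.
        Q, R = np.linalg.qr(X + step * rg)
        X = Q * np.sign(np.diag(R))  # fix column signs for a continuous retraction
    return X
```

The QR retraction is one common choice that matches the abstract's claim of avoiding matrix inversion; updating a single sampled column per step is what keeps the per-iteration cost low relative to a full deterministic Riemannian gradient step.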
Cite
Text
Xu et al. "Matrix Eigen-Decomposition via Doubly Stochastic Riemannian Optimization." International Conference on Machine Learning, 2016.
Markdown
[Xu et al. "Matrix Eigen-Decomposition via Doubly Stochastic Riemannian Optimization." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/xu2016icml-matrix/)
BibTeX
@inproceedings{xu2016icml-matrix,
  title     = {{Matrix Eigen-Decomposition via Doubly Stochastic Riemannian Optimization}},
  author    = {Xu, Zhiqiang and Zhao, Peilin and Cao, Jianneng and Li, Xiaoli},
  booktitle = {International Conference on Machine Learning},
  year      = {2016},
  pages     = {1660--1669},
  volume    = {48},
  url       = {https://mlanthology.org/icml/2016/xu2016icml-matrix/}
}