Principal Component Projection and Regression in Nearly Linear Time Through Asymmetric SVRG

Abstract

Given an n-by-d data matrix A, principal component projection (PCP) and principal component regression (PCR), i.e., projection and regression restricted to the top eigenspace of A, are fundamental problems in machine learning, optimization, and numerical analysis. In this paper we provide the first algorithms that solve these problems in nearly linear time for a fixed eigenvalue distribution and large n. This improves upon previous methods, which had superlinear running times when either the number of top eigenvalues or the gap between the eigenspaces was large. We achieve our results by applying rational polynomial approximations to reduce the problem to solving asymmetric linear systems, which we solve by a variant of SVRG. We corroborate these findings with preliminary empirical experiments.
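For reference, a minimal sketch of what PCP and PCR compute, using an exact SVD as the (slow) baseline definition; the paper's contribution is approximating these outputs in nearly linear time without forming the SVD. The sizes `n`, `d`, `k` and the vectors `b`, `y` here are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 10, 3  # illustrative sizes; k = number of top components (assumption)
A = rng.standard_normal((n, d))
b = rng.standard_normal(d)   # vector to project (PCP)
y = rng.standard_normal(n)   # response vector (PCR)

# The top-k right singular vectors of A span the top eigenspace of A^T A.
_, _, Vt = np.linalg.svd(A, full_matrices=False)
Vk = Vt[:k].T  # d x k orthonormal basis of the top eigenspace

# PCP: orthogonal projection of b onto the top eigenspace.
b_proj = Vk @ (Vk.T @ b)

# PCR: least-squares regression restricted to that subspace.
# Solve min_z ||A Vk z - y||_2, then map back to R^d via x = Vk z.
z, *_ = np.linalg.lstsq(A @ Vk, y, rcond=None)
x_pcr = Vk @ z
```

Computing the SVD costs O(nd^2) time, which is what motivates the fast, approximation-based approach in the paper.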

Cite

Text

Jin and Sidford. "Principal Component Projection and Regression in Nearly Linear Time Through Asymmetric SVRG." Neural Information Processing Systems, 2019.

Markdown

[Jin and Sidford. "Principal Component Projection and Regression in Nearly Linear Time Through Asymmetric SVRG." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/jin2019neurips-principal/)

BibTeX

@inproceedings{jin2019neurips-principal,
  title     = {{Principal Component Projection and Regression in Nearly Linear Time Through Asymmetric SVRG}},
  author    = {Jin, Yujia and Sidford, Aaron},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {3868--3878},
  url       = {https://mlanthology.org/neurips/2019/jin2019neurips-principal/}
}