Faster Principal Component Regression and Stable Matrix Chebyshev Approximation
Abstract
We solve principal component regression (PCR), up to a multiplicative accuracy $1+\gamma$, by reducing the problem to $\tilde{O}(\gamma^{-1})$ black-box calls of ridge regression. Therefore, our algorithm does not require any explicit construction of the top principal components, and is suitable for large-scale PCR instances. In contrast, the previous result requires $\tilde{O}(\gamma^{-2})$ such black-box calls. We obtain this result by developing a general stable recurrence formula for matrix Chebyshev polynomials, and a degree-optimal polynomial approximation to the matrix sign function. Our techniques may be of independent interest, especially when designing iterative methods.
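For context only (this is not the paper's algorithm), the sketch below shows the classical way to solve PCR: explicitly compute the top-$k$ principal components via an SVD and regress in that subspace. This is exactly the expensive step the paper's ridge-regression reduction avoids. It also shows the textbook three-term matrix Chebyshev recurrence, of which the paper develops a numerically stable variant. All names here (`A`, `b`, `k`, `M`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))   # data matrix (n samples x d features)
b = rng.standard_normal(100)         # response vector
k = 5                                # number of principal components

# Classical PCR: top-k right singular vectors of A span the PC subspace.
_, _, Vt = np.linalg.svd(A, full_matrices=False)
Vk = Vt[:k].T                        # (d, k) basis of the top-k PC subspace

# Regress b on the projected data, then map the coefficients back.
coef_pc, *_ = np.linalg.lstsq(A @ Vk, b, rcond=None)
x_pcr = Vk @ coef_pc                 # PCR solution in R^d
print(x_pcr.shape)                   # (20,)

# Textbook matrix Chebyshev recurrence T_{j+1}(M) = 2 M T_j(M) - T_{j-1}(M);
# the paper's contribution is a stable version of this kind of recurrence.
M = 0.5 * (Vk @ Vk.T)                # symmetric matrix with spectral norm <= 1
T_prev, T_curr = np.eye(20), M       # T_0(M) = I, T_1(M) = M
for _ in range(3):
    T_prev, T_curr = T_curr, 2 * M @ T_curr - T_prev
```

The explicit SVD costs $O(nd\min(n,d))$ time, which motivates replacing it with $\tilde{O}(\gamma^{-1})$ ridge-regression solves as in the paper.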
Cite
Text
Allen-Zhu and Li. "Faster Principal Component Regression and Stable Matrix Chebyshev Approximation." International Conference on Machine Learning, 2017.
Markdown
[Allen-Zhu and Li. "Faster Principal Component Regression and Stable Matrix Chebyshev Approximation." International Conference on Machine Learning, 2017.](https://mlanthology.org/icml/2017/allenzhu2017icml-faster/)
BibTeX
@inproceedings{allenzhu2017icml-faster,
title = {{Faster Principal Component Regression and Stable Matrix Chebyshev Approximation}},
author = {Allen-Zhu, Zeyuan and Li, Yuanzhi},
booktitle = {International Conference on Machine Learning},
year = {2017},
pages = {107-115},
volume = {70},
url = {https://mlanthology.org/icml/2017/allenzhu2017icml-faster/}
}