Faster Ridge Regression via the Subsampled Randomized Hadamard Transform
Abstract
We propose a fast algorithm for ridge regression when the number of features is much larger than the number of observations ($p \gg n$). The standard approach in this setting solves the problem in the dual space and has a running time of $O(n^2p)$. Our algorithm (SRHT-DRR) runs in time $O(np\log(n))$ and works by preconditioning the design matrix with a Randomized Walsh-Hadamard Transform followed by a subsampling of features. We provide risk bounds for SRHT-DRR in the fixed-design setting and show experimental results on synthetic and real datasets.
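The abstract compresses the whole pipeline into one sentence, so a small numerical sketch may help. Below is a minimal numpy illustration of the two stages: an SRHT applied to the feature dimension (random signs, a normalized fast Walsh-Hadamard transform, then a uniform column subsample), followed by dual-space ridge regression on the sketched design. The function names fwht, srht, and dual_ridge are illustrative, not the authors' code; the sketch assumes p is already a power of two and omits zero-padding.

import numpy as np

def fwht(a):
    # Fast Walsh-Hadamard transform along the last axis; the axis
    # length must be a power of two. O(n p log p) for an (n, p) input.
    a = a.copy()
    p = a.shape[-1]
    h = 1
    while h < p:
        for i in range(0, p, 2 * h):
            x = a[..., i:i + h].copy()
            y = a[..., i + h:i + 2 * h]
            a[..., i:i + h] = x + y
            a[..., i + h:i + 2 * h] = x - y
        h *= 2
    return a

def srht(X, p_subs, rng):
    # Precondition the columns of X: random signs (D), normalized
    # Hadamard transform (H), then keep a uniform subsample of
    # p_subs columns (S), rescaled so the sketch is unbiased.
    n, p = X.shape
    signs = rng.choice([-1.0, 1.0], size=p)
    XH = fwht(X * signs) / np.sqrt(p)
    keep = rng.choice(p, size=p_subs, replace=False)
    return XH[:, keep] * np.sqrt(p / p_subs)

def dual_ridge(X, y, lam):
    # Dual-space ridge regression: solve (X X^T + lam I) alpha = y,
    # then map back to primal weights w = X^T alpha.
    n = X.shape[0]
    alpha = np.linalg.solve(X @ X.T + lam * np.eye(n), y)
    return X.T @ alpha

rng = np.random.default_rng(0)
n, p, p_subs, lam = 100, 4096, 512, 1.0        # p a power of two (assumed)
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)
w = dual_ridge(srht(X, p_subs, rng), y, lam)   # weights live in the sketched space

Note that in the fixed-design setting the risk is evaluated on the same design matrix, so the weights learned in the sketched space can be used directly on the sketched design; predicting at new points would require applying the same signs and column subsample to them first.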
Cite
Text
Lu et al. "Faster Ridge Regression via the Subsampled Randomized Hadamard Transform." Neural Information Processing Systems, 2013.

Markdown
[Lu et al. "Faster Ridge Regression via the Subsampled Randomized Hadamard Transform." Neural Information Processing Systems, 2013.](https://mlanthology.org/neurips/2013/lu2013neurips-faster/)

BibTeX
@inproceedings{lu2013neurips-faster,
  title     = {{Faster Ridge Regression via the Subsampled Randomized Hadamard Transform}},
  author    = {Lu, Yichao and Dhillon, Paramveer and Foster, Dean P. and Ungar, Lyle},
  booktitle = {Neural Information Processing Systems},
  year      = {2013},
  pages     = {369--377},
  url       = {https://mlanthology.org/neurips/2013/lu2013neurips-faster/}
}