Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees

Abstract

Random Fourier features is one of the most popular techniques for scaling up kernel methods, such as kernel ridge regression. However, despite impressive empirical results, the statistical properties of random Fourier features are still not well understood. In this paper we take steps toward filling this gap. Specifically, we approach random Fourier features from a spectral matrix approximation point of view, give tight bounds on the number of Fourier features required to achieve a spectral approximation, and show how spectral matrix approximation bounds imply statistical guarantees for kernel ridge regression.
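As a hedged illustration of the technique the abstract discusses (a generic sketch, not the authors' implementation or their spectral-approximation analysis), the following NumPy snippet approximates the Gaussian kernel with random Fourier features and uses the resulting feature map in ridge regression. Function names, the choice of Gaussian kernel, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, D, sigma=1.0, rng=rng):
    # Map rows of X to D random Fourier features whose inner products
    # approximate the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))  # samples from the kernel's spectral density
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)       # random phase shifts
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def krr_rff(X, y, X_test, D=500, sigma=1.0, lam=1e-3):
    # Kernel ridge regression on the approximate feature map:
    # solve (Z^T Z + n * lam * I) w = Z^T y, then predict with Z_test @ w.
    # Train and test points share one draw of (W, b), so we map them jointly.
    Z = rff_features(np.vstack([X, X_test]), D, sigma)
    Ztr, Zte = Z[: len(X)], Z[len(X):]
    n = len(X)
    w = np.linalg.solve(Ztr.T @ Ztr + n * lam * np.eye(D), Ztr.T @ y)
    return Zte @ w
```

With enough features `D`, the Gram matrix of `rff_features` concentrates around the exact kernel matrix, which is the kind of approximation the paper's spectral bounds make precise.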

Cite

Text

Avron et al. "Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees." International Conference on Machine Learning, 2017.

Markdown

[Avron et al. "Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees." International Conference on Machine Learning, 2017.](https://mlanthology.org/icml/2017/avron2017icml-random/)

BibTeX

@inproceedings{avron2017icml-random,
  title     = {{Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees}},
  author    = {Avron, Haim and Kapralov, Michael and Musco, Cameron and Musco, Christopher and Velingker, Ameya and Zandieh, Amir},
  booktitle = {International Conference on Machine Learning},
  year      = {2017},
  pages     = {253--262},
  volume    = {70},
  url       = {https://mlanthology.org/icml/2017/avron2017icml-random/}
}