Fastfood - Computing Hilbert Space Expansions in Loglinear Time

Abstract

Fast nonlinear function classes are crucial for nonparametric estimation, such as in kernel methods. This paper proposes an improvement to random kitchen sinks that offers significantly faster computation in log-linear time without sacrificing accuracy. Furthermore, we show how one may adjust the regularization properties of the kernel simply by changing the spectral distribution of the projection matrix. We provide experimental results showing that even for moderately small problems we already achieve two orders of magnitude faster computation and a three orders of magnitude lower memory footprint.
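The abstract describes the construction only at a high level. As a rough illustration, the sketch below implements one block of a Fastfood-style random feature map in NumPy, assuming the S H G Π H B factorization with a fast Walsh-Hadamard transform from the paper; the class name `FastfoodRBF`, the helper `fwht`, and all parameter choices are illustrative and not the authors' reference implementation.

```python
import numpy as np

def fwht(x):
    """Unnormalized fast Walsh-Hadamard transform along the last axis.
    The length of the last axis must be a power of two; runs in O(d log d)."""
    x = np.array(x, dtype=float, copy=True)
    d = x.shape[-1]
    h = 1
    while h < d:
        for i in range(0, d, 2 * h):
            a = x[..., i:i + h].copy()
            b = x[..., i + h:i + 2 * h].copy()
            x[..., i:i + h] = a + b
            x[..., i + h:i + 2 * h] = a - b
        h *= 2
    return x

class FastfoodRBF:
    """Illustrative single Fastfood block: V = (1 / (sigma * sqrt(d))) S H G Pi H B,
    applied implicitly via diagonal multiplies, a permutation, and two FWHTs."""

    def __init__(self, d, sigma=1.0, seed=0):
        assert d & (d - 1) == 0, "d must be a power of two (pad inputs otherwise)"
        rng = np.random.default_rng(seed)
        self.d, self.sigma = d, sigma
        self.B = rng.choice([-1.0, 1.0], size=d)   # random sign flips (diagonal B)
        self.G = rng.standard_normal(d)            # Gaussian diagonal G
        self.P = rng.permutation(d)                # random permutation Pi
        # chi_d-distributed row lengths so V mimics rows of a dense Gaussian matrix
        s = np.sqrt(rng.chisquare(df=d, size=d))
        self.S = s / np.linalg.norm(self.G)

    def transform(self, X):
        """Map X of shape (n, d) to 2d random Fourier features approximating
        the RBF kernel exp(-||x - x'||^2 / (2 sigma^2))."""
        Z = fwht(X * self.B)                       # H B x, via FWHT instead of a dense matrix
        Z = Z[:, self.P]                           # Pi
        Z = fwht(Z * self.G)                       # H G (...)
        Z = Z * self.S / (self.sigma * np.sqrt(self.d))
        return np.hstack([np.cos(Z), np.sin(Z)]) / np.sqrt(self.d)

# Illustrative usage: features whose inner products approximate an RBF kernel.
# X = np.random.randn(100, 256)
# phi = FastfoodRBF(d=256, sigma=2.0).transform(X)   # shape (100, 512)
```

Because the dense Gaussian projection of random kitchen sinks is replaced by diagonal matrices, a permutation, and Hadamard transforms, the per-example cost drops from O(d^2) to O(d log d) and the stored parameters from O(d^2) to O(d), which is the source of the speed and memory gains claimed above.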

Cite

Text

Le et al. "Fastfood - Computing Hilbert Space Expansions in Loglinear Time." International Conference on Machine Learning, 2013.

Markdown

[Le et al. "Fastfood - Computing Hilbert Space Expansions in Loglinear Time." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/le2013icml-fastfood/)

BibTeX

@inproceedings{le2013icml-fastfood,
  title     = {{Fastfood - Computing Hilbert Space Expansions in Loglinear Time}},
  author    = {Le, Quoc and Sarlos, Tamas and Smola, Alexander},
  booktitle = {International Conference on Machine Learning},
  year      = {2013},
  pages     = {244--252},
  volume    = {28},
  url       = {https://mlanthology.org/icml/2013/le2013icml-fastfood/}
}