Recycling Randomness with Structure for Sublinear Time Kernel Expansions

Abstract

We propose a scheme for recycling Gaussian random vectors into structured matrices to approximate various kernel functions in sublinear time via random embeddings. Our framework includes the Fastfood construction of Le et al. (2013) as a special case, but also extends to Circulant, Toeplitz and Hankel matrices, and the broader family of structured matrices that are characterized by the concept of low-displacement rank. We introduce notions of coherence and graph-theoretic structural constants that control the approximation quality, and prove unbiasedness and low-variance properties of random feature maps that arise within our framework. For the case of low-displacement matrices, we show how the degree of structure and randomness can be controlled to reduce statistical variance at the cost of increased computation and storage requirements. Empirical results strongly support our theory and justify the use of a broader family of structured matrices for scaling up kernel methods using random features.
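To make the idea concrete, here is a minimal sketch (not the paper's exact construction; all names are illustrative) of circulant random Fourier features for the Gaussian kernel: a single Gaussian vector `g` is "recycled" into a circulant matrix, multiplication by which costs O(d log d) via the FFT, with Rademacher sign flips applied first as a simple decorrelation step. Because every row of the circulant matrix is a permutation of `g`, each feature is marginally distributed as in standard random Fourier features, so the kernel estimate stays unbiased while only O(d) random numbers are stored.

```python
import numpy as np

def circulant_features(X, g, signs, b):
    """Random Fourier features z(x) = sqrt(2/d) * cos(C D x + b), where
    C = circulant(g) (applied via FFT) and D = diag(signs)."""
    d = g.shape[0]
    Xs = X * signs                     # Rademacher sign flips: D x
    fg = np.fft.fft(g)
    # Circulant multiply per row: (C x)_i = sum_k g[(i-k) mod d] x_k,
    # computed as an inverse FFT of the elementwise spectral product.
    proj = np.fft.ifft(fg * np.fft.fft(Xs, axis=1), axis=1).real
    return np.sqrt(2.0 / d) * np.cos(proj + b)

rng = np.random.default_rng(0)
d = 64
g = rng.standard_normal(d)             # one Gaussian vector, recycled into C
signs = rng.choice([-1.0, 1.0], size=d)
b = rng.uniform(0.0, 2.0 * np.pi, size=d)

x = rng.standard_normal(d) / np.sqrt(d)
y = rng.standard_normal(d) / np.sqrt(d)
exact = np.exp(-0.5 * np.linalg.norm(x - y) ** 2)   # Gaussian kernel k(x, y)
Z = circulant_features(np.vstack([x, y]), g, signs, b)
est = Z[0] @ Z[1]                      # feature-map estimate of k(x, y)
```

The same O(d log d) multiply applies to Toeplitz and Hankel matrices after embedding them into a circulant of twice the size, which is what makes this family attractive for sublinear-time feature expansions.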

Cite

Text

Choromanski and Sindhwani. "Recycling Randomness with Structure for Sublinear Time Kernel Expansions." International Conference on Machine Learning, 2016.

Markdown

[Choromanski and Sindhwani. "Recycling Randomness with Structure for Sublinear Time Kernel Expansions." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/choromanski2016icml-recycling/)

BibTeX

@inproceedings{choromanski2016icml-recycling,
  title     = {{Recycling Randomness with Structure for Sublinear Time Kernel Expansions}},
  author    = {Choromanski, Krzysztof and Sindhwani, Vikas},
  booktitle = {International Conference on Machine Learning},
  year      = {2016},
  pages     = {2502--2510},
  volume    = {48},
  url       = {https://mlanthology.org/icml/2016/choromanski2016icml-recycling/}
}