A La Carte - Learning Fast Kernels

Abstract

Kernel methods have great promise for learning rich statistical representations of large modern datasets. However, compared to neural networks, kernel methods have been perceived as lacking in scalability and flexibility. We introduce a family of fast, flexible, lightly parametrized and general-purpose kernel learning methods, derived from Fastfood basis function expansions. We provide mechanisms to learn the properties of groups of spectral frequencies in these expansions, which require only O(m log d) time and O(m) memory, for m basis functions and d input dimensions. We show that the proposed methods can learn a wide class of kernels, outperforming the alternatives in accuracy, speed, and memory consumption.
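To make the O(m log d) claim concrete, the following is a minimal pure-Python sketch of a single Fastfood block in the style the abstract refers to: random features for an RBF kernel built from V x = (1/(σ√d)) S H G Π H B x, where H is a Walsh-Hadamard transform applied in O(d log d) time. The function names (`fwht`, `fastfood_features`, `sample_params`), the single-block setup, and the simplified sampling of the scaling matrix S are illustrative assumptions, not the paper's exact construction.

```python
import math
import random

def fwht(a):
    """In-place fast Walsh-Hadamard transform; len(a) must be a power of two."""
    h, n = 1, len(a)
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

def sample_params(d, rng):
    """Sample one Fastfood block's diagonal/permutation matrices (sketch)."""
    B = [rng.choice((-1.0, 1.0)) for _ in range(d)]   # random sign diagonal
    Pi = list(range(d)); rng.shuffle(Pi)              # random permutation
    G = [rng.gauss(0.0, 1.0) for _ in range(d)]       # Gaussian diagonal
    # Simplified S: rescale rows so their norms mimic those of a dense
    # Gaussian matrix (chi-distributed norms divided by ||G||).
    gnorm = math.sqrt(sum(g * g for g in G))
    S = [math.sqrt(sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(d))) / gnorm
         for _ in range(d)]
    return B, Pi, G, S

def fastfood_features(x, params, sigma):
    """RBF random features from one Fastfood block: cos/sin of V x."""
    B, Pi, G, S = params
    d = len(x)
    v = [b * xi for b, xi in zip(B, x)]   # binary scaling B
    fwht(v)                               # first Hadamard transform H
    v = [v[p] for p in Pi]                # permutation Pi
    v = [g * vi for g, vi in zip(G, v)]   # Gaussian diagonal G
    fwht(v)                               # second Hadamard transform H
    scale = 1.0 / (sigma * math.sqrt(d))
    v = [s * vi * scale for s, vi in zip(S, v)]  # scaling S and 1/(sigma sqrt(d))
    norm = math.sqrt(1.0 / d)             # so that phi(x) . phi(x) = 1
    return ([norm * math.cos(vi) for vi in v] +
            [norm * math.sin(vi) for vi in v])
```

With this normalization, phi(x) . phi(y) = (1/d) Σ_i cos(v_i . (x − y)), a Monte Carlo estimate of the Gaussian kernel exp(−‖x − y‖² / (2σ²)); the learnable "à la carte" variants of the paper additionally parametrize the spectral frequencies of groups of such blocks.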

Cite

Text

Yang et al. "A La Carte - Learning Fast Kernels." International Conference on Artificial Intelligence and Statistics, 2015.

Markdown

[Yang et al. "A La Carte - Learning Fast Kernels." International Conference on Artificial Intelligence and Statistics, 2015.](https://mlanthology.org/aistats/2015/yang2015aistats-la/)

BibTeX

@inproceedings{yang2015aistats-la,
  title     = {{A La Carte - Learning Fast Kernels}},
  author    = {Yang, Zichao and Wilson, Andrew Gordon and Smola, Alexander J. and Song, Le},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2015},
  url       = {https://mlanthology.org/aistats/2015/yang2015aistats-la/}
}