Hyperkernels

Abstract

We consider the problem of choosing a kernel suitable for estimation using a Gaussian Process estimator or a Support Vector Machine. A novel solution is presented which involves defining a Reproducing Kernel Hilbert Space on the space of kernels itself. By utilizing an analog of the classical representer theorem, the problem of choosing a kernel from a parameterized family of kernels (e.g. of varying width) is reduced to a statistical estimation problem akin to the problem of minimizing a regularized risk functional. Various classical settings for model or kernel selection are special cases of our framework.
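The abstract's construction can be illustrated with a small sketch: a hyperkernel acts on *pairs* of inputs, and the representer-theorem analog expresses the learned kernel as a combination of hyperkernel evaluations anchored at training pairs. The sketch below uses a harmonic-type hyperkernel built from a bounded RBF base kernel; the parameter names `lam`, `gamma`, and the coefficient array `beta` are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Base kernel: Gaussian RBF, bounded in (0, 1]; gamma is a hypothetical width.
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

def harmonic_hyperkernel(pair1, pair2, lam=0.6, gamma=1.0):
    # A harmonic-type hyperkernel on pairs of inputs:
    #   k_((x, x'), (y, y')) = (1 - lam) / (1 - lam * k(x, x') * k(y, y'))
    # valid for 0 < lam < 1 when the base kernel takes values in [0, 1].
    (x, xp), (y, yp) = pair1, pair2
    return (1 - lam) / (1 - lam * rbf(x, xp, gamma) * rbf(y, yp, gamma))

def learned_kernel(x, y, data, beta, lam=0.6, gamma=1.0):
    # Representer-style expansion: the learned kernel is a weighted sum of
    # hyperkernel evaluations anchored at training input pairs (data[i], data[j]).
    val = 0.0
    for i in range(len(data)):
        for j in range(len(data)):
            val += beta[i, j] * harmonic_hyperkernel(
                (data[i], data[j]), (x, y), lam=lam, gamma=gamma)
    return val
```

In this sketch the coefficients `beta` would be found by minimizing a regularized risk functional over the hyper-RKHS, which is the estimation problem the paper reduces kernel selection to; here they are left as free inputs.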

Cite

Text

Ong et al. "Hyperkernels." Neural Information Processing Systems, 2002.

Markdown

[Ong et al. "Hyperkernels." Neural Information Processing Systems, 2002.](https://mlanthology.org/neurips/2002/ong2002neurips-hyperkernels/)

BibTeX

@inproceedings{ong2002neurips-hyperkernels,
  title     = {{Hyperkernels}},
  author    = {Ong, Cheng S. and Williamson, Robert C. and Smola, Alex J.},
  booktitle = {Neural Information Processing Systems},
  year      = {2002},
  pages     = {495--502},
  url       = {https://mlanthology.org/neurips/2002/ong2002neurips-hyperkernels/}
}