Gaussian and Wishart Hyperkernels

Abstract

We propose a new method for constructing hyperkernels and define two promising special cases that can be computed in closed form. These we call the Gaussian and Wishart hyperkernels. The former is especially attractive in that it has an interpretable regularization scheme reminiscent of that of the Gaussian RBF kernel. We discuss how kernel learning can be used not just for improving the performance of classification and regression methods, but also as a stand-alone algorithm for dimensionality reduction and relational or metric learning.
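
To make the hyperkernel idea concrete, here is a minimal illustrative sketch: a hyperkernel is a positive semi-definite kernel on *pairs* of inputs, and a learned kernel is expanded as a linear combination of its sections. The symmetrized product-of-RBFs construction below is a standard way to build a valid hyperkernel; it is an assumption for illustration, not the paper's closed-form Gaussian or Wishart hyperkernel, and the names `product_hyperkernel`, `rbf`, and `sigma` are hypothetical.

```python
import numpy as np

def rbf(x, z, sigma=1.0):
    """Ordinary Gaussian RBF kernel between two points."""
    return np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2))

def product_hyperkernel(pair1, pair2, sigma=1.0):
    """A kernel on pairs of inputs: a symmetrized product of RBF kernels.

    Illustrative construction only -- not the paper's closed-form
    Gaussian hyperkernel.
    """
    (x, xp), (z, zp) = pair1, pair2
    return 0.5 * (rbf(x, z, sigma) * rbf(xp, zp, sigma)
                  + rbf(x, zp, sigma) * rbf(xp, z, sigma))

# A learned kernel is then a linear combination of hyperkernel sections:
#   k(x, x') = sum_i alpha_i * K((x, x'), (z_i, z_i'))
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))                       # toy data, 4 points in R^3
pairs = [(X[i], X[j]) for i in range(4) for j in range(4)]
G = np.array([[product_hyperkernel(p, q) for q in pairs] for p in pairs])
print(G.shape, np.all(np.linalg.eigvalsh(G) > -1e-9))  # PSD Gram matrix on pairs
```

The check at the end verifies that the Gram matrix over input pairs is positive semi-definite, which is the defining property a hyperkernel must satisfy.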

Cite

Text

Kondor and Jebara. "Gaussian and Wishart Hyperkernels." Neural Information Processing Systems, 2006.

Markdown

[Kondor and Jebara. "Gaussian and Wishart Hyperkernels." Neural Information Processing Systems, 2006.](https://mlanthology.org/neurips/2006/kondor2006neurips-gaussian/)

BibTeX

@inproceedings{kondor2006neurips-gaussian,
  title     = {{Gaussian and Wishart Hyperkernels}},
  author    = {Kondor, Risi and Jebara, Tony},
  booktitle = {Neural Information Processing Systems},
  year      = {2006},
  pages     = {729--736},
  url       = {https://mlanthology.org/neurips/2006/kondor2006neurips-gaussian/}
}