Regularization with Dot-Product Kernels

Abstract

In this paper we give necessary and sufficient conditions under which kernels of dot product type k(x, y) = k(x · y) satisfy Mercer's condition and thus may be used in Support Vector Machines (SVM), Regularization Networks (RN) or Gaussian Processes (GP). In particular, we show that if the kernel is analytic (i.e. can be expanded in a Taylor series), all expansion coefficients have to be nonnegative. We give an explicit functional form for the feature map by calculating its eigenfunctions and eigenvalues.
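
As a rough numerical illustration of the nonnegativity condition (a small Python sketch, not code from the paper): the dot-product kernel k(t) = 1 + t + t^2 has only nonnegative Taylor coefficients and yields a positive semi-definite Gram matrix, whereas k(t) = 1 - t + t^2 has a negative coefficient and can produce Gram matrices with negative eigenvalues, violating Mercer's condition.

import numpy as np

# Sketch only: compare two dot-product kernels k(<x, y>) on points of the
# unit sphere and check whether their Gram matrices are positive
# semi-definite, as Mercer's condition requires.
rng = np.random.default_rng(0)
X = rng.standard_normal((25, 5))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # points on the unit sphere
X = np.vstack([X, -X])                         # include antipodal pairs
G = X @ X.T                                    # dot products <x_i, x_j>

kernels = {
    "k(t) = 1 + t + t^2 (coefficients >= 0)": 1.0 + G + G**2,
    "k(t) = 1 - t + t^2 (negative coefficient)": 1.0 - G + G**2,
}
for name, K in kernels.items():
    # The first kernel gives a smallest eigenvalue of about 0 (up to
    # rounding); the second gives a clearly negative one.
    print(f"{name}: smallest Gram eigenvalue = {np.linalg.eigvalsh(K).min():.3e}")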

Cite

Text

Smola et al. "Regularization with Dot-Product Kernels." Neural Information Processing Systems, 2000.

Markdown

[Smola et al. "Regularization with Dot-Product Kernels." Neural Information Processing Systems, 2000.](https://mlanthology.org/neurips/2000/smola2000neurips-regularization/)

BibTeX

@inproceedings{smola2000neurips-regularization,
  title     = {{Regularization with Dot-Product Kernels}},
  author    = {Smola, Alex J. and Óvári, Zoltán L. and Williamson, Robert C.},
  booktitle = {Neural Information Processing Systems},
  year      = {2000},
  pages     = {308--314},
  url       = {https://mlanthology.org/neurips/2000/smola2000neurips-regularization/}
}