On the Complexity of Learning the Kernel Matrix

Abstract

We investigate data-based procedures for selecting the kernel when learning with Support Vector Machines. We provide generalization error bounds by estimating the Rademacher complexities of the corresponding function classes. In particular, we obtain a complexity bound for function classes induced by kernels with given eigenvectors, i.e., we allow the spectrum to vary while keeping the eigenvectors fixed. This bound is only a logarithmic factor larger than the complexity of the function class induced by a single kernel. However, optimizing the margin over such classes leads to overfitting. We thus propose a suitable way of constraining the class. We use an efficient algorithm to solve the resulting optimization problem, present preliminary experimental results, and compare them to an alignment-based approach.
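The kernel class the abstract describes, fixed eigenvectors with a varying spectrum, can be made concrete with a short sketch. The NumPy code below is illustrative only, not the authors' implementation: the function names `spectral_kernel_family` and `alignment`, and the toy data, are hypothetical. It builds a Gram matrix from the eigenvectors of a base kernel and a new nonnegative spectrum, and computes the empirical alignment A(K, yy^T) = y^T K y / (n ||K||_F) underlying the alignment-based approach mentioned at the end of the abstract.

```python
import numpy as np

def spectral_kernel_family(K0, mu):
    """Gram matrix with the eigenvectors of K0 and a new spectrum mu.

    Parameterizes the class from the abstract: the eigenvectors stay
    fixed, only the (nonnegative) spectrum varies.
    """
    _, V = np.linalg.eigh(K0)          # fixed eigenvectors of the base kernel
    mu = np.asarray(mu, dtype=float)
    assert np.all(mu >= 0), "a valid kernel needs a nonnegative spectrum"
    return (V * mu) @ V.T              # V @ diag(mu) @ V.T

def alignment(K, y):
    """Empirical alignment A(K, yy^T) for labels y in {-1, +1}:
    <K, yy^T>_F / (||K||_F * ||yy^T||_F) = y^T K y / (n * ||K||_F)."""
    y = np.asarray(y, dtype=float)
    return (y @ K @ y) / (len(y) * np.linalg.norm(K, "fro"))

# Toy usage (hypothetical data): a linear base kernel on random points.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
y = np.sign(X[:, 0] + 0.1)             # arbitrary labels for illustration
K0 = X @ X.T

# A new spectrum, rescaled so that trace(K) = trace(K0).
eigvals = np.linalg.eigvalsh(K0)
mu = np.maximum(eigvals, 0.0) ** 2
mu *= eigvals.sum() / mu.sum()

K = spectral_kernel_family(K0, mu)
print(alignment(K0, y), alignment(K, y))
```

Rescaling the spectrum under a trace constraint, as in the toy usage above, reflects the kind of constraint the abstract argues is needed to avoid overfitting when optimizing the margin over the whole class.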

Cite

Text

Bousquet and Herrmann. "On the Complexity of Learning the Kernel Matrix." Neural Information Processing Systems, 2002.

Markdown

[Bousquet and Herrmann. "On the Complexity of Learning the Kernel Matrix." Neural Information Processing Systems, 2002.](https://mlanthology.org/neurips/2002/bousquet2002neurips-complexity/)

BibTeX

@inproceedings{bousquet2002neurips-complexity,
  title     = {{On the Complexity of Learning the Kernel Matrix}},
  author    = {Bousquet, Olivier and Herrmann, Daniel},
  booktitle = {Neural Information Processing Systems},
  year      = {2002},
  pages     = {415--422},
  url       = {https://mlanthology.org/neurips/2002/bousquet2002neurips-complexity/}
}