Asymptotic Universality for Learning Curves of Support Vector Machines

Abstract

Using methods of Statistical Physics, we investigate the role of model complexity in learning with support vector machines (SVMs). We show the advantages of using SVMs with kernels of infinite complexity on noisy target rules: in contrast to common theoretical beliefs, such SVMs are found to achieve optimal generalization error even though the training error does not converge to the generalization error. Moreover, we find universal asymptotics of the learning curves that depend only on the target rule, not on the SVM kernel.

Cite

Text

Opper and Urbanczik. "Asymptotic Universality for Learning Curves of Support Vector Machines." Neural Information Processing Systems, 2001.

Markdown

[Opper and Urbanczik. "Asymptotic Universality for Learning Curves of Support Vector Machines." Neural Information Processing Systems, 2001.](https://mlanthology.org/neurips/2001/opper2001neurips-asymptotic/)

BibTeX

@inproceedings{opper2001neurips-asymptotic,
  title     = {{Asymptotic Universality for Learning Curves of Support Vector Machines}},
  author    = {Opper, Manfred and Urbanczik, Robert},
  booktitle = {Neural Information Processing Systems},
  year      = {2001},
  pages     = {479--486},
  url       = {https://mlanthology.org/neurips/2001/opper2001neurips-asymptotic/}
}