A Note on the Generalization Performance of Kernel Classifiers with Margin
Abstract
We present distribution-independent bounds on the generalization misclassification performance of a family of kernel classifiers with margin. Support Vector Machine (SVM) classifiers stem from this class of machines. The bounds are derived through computations of the Vγ dimension of a family of loss functions to which the SVM loss belongs. Bounds that use functions of margin distributions (i.e., functions of the slack variables of SVM) are derived.
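As a concrete illustration (not taken from the paper itself), the slack variables mentioned in the abstract are the hinge losses ξ_i = max(0, 1 − y_i f(x_i)) of a soft-margin SVM; a function of the margin distribution, such as their average, can be sketched as follows. The function names here are our own.

```python
def slack(y, fx):
    """SVM slack variable xi = max(0, 1 - y*f(x)) for a label y in {-1, +1}
    and classifier output f(x). Zero iff the point is classified with
    margin at least 1."""
    return max(0.0, 1.0 - y * fx)

def empirical_margin_error(ys, fxs):
    """Average of the slack variables over the sample -- one example of a
    function of the margin distribution of the kind the bounds refer to."""
    return sum(slack(y, fx) for y, fx in zip(ys, fxs)) / len(ys)

# Example: one point well outside the margin, one inside it, one misclassified.
ys = [+1, -1, +1]
fxs = [2.0, -0.5, -1.0]
# slacks are 0.0, 0.5, 2.0; their average is 2.5 / 3
print(empirical_margin_error(ys, fxs))
```

Points with 0 < ξ_i < 1 are correctly classified but fall inside the margin; points with ξ_i ≥ 1 are misclassified, which is why bounds phrased in terms of the slack variables dominate bounds on the misclassification error.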
Cite
Text
Evgeniou and Pontil. "A Note on the Generalization Performance of Kernel Classifiers with Margin." International Conference on Algorithmic Learning Theory, 2000. doi:10.1007/3-540-40992-0_23
Markdown
[Evgeniou and Pontil. "A Note on the Generalization Performance of Kernel Classifiers with Margin." International Conference on Algorithmic Learning Theory, 2000.](https://mlanthology.org/alt/2000/evgeniou2000alt-note/) doi:10.1007/3-540-40992-0_23
BibTeX
@inproceedings{evgeniou2000alt-note,
title = {{A Note on the Generalization Performance of Kernel Classifiers with Margin}},
author = {Evgeniou, Theodoros and Pontil, Massimiliano},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {2000},
pages = {306--315},
doi = {10.1007/3-540-40992-0_23},
url = {https://mlanthology.org/alt/2000/evgeniou2000alt-note/}
}