On the VC Dimension of Bounded Margin Classifiers
Abstract
In this paper we prove a result that is fundamental to the generalization properties of Vapnik's support vector machines and other large margin classifiers. In particular, we prove that the minimum margin over all dichotomies of k ≤ n + 1 points inside a unit ball in R^n is maximized when the points form a regular simplex on the unit sphere. We also provide an alternative proof directly in the framework of level fat shattering.
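The extremal configuration in the theorem, a regular simplex of n + 1 points on the unit sphere in R^n, can be constructed explicitly. A minimal numpy sketch (illustrative only, not taken from the paper): take the standard basis vectors of R^{n+1}, subtract their centroid, and normalize; the resulting n + 1 points lie on the unit sphere of an n-dimensional subspace with all pairwise inner products equal to -1/n, which characterizes the regular simplex.

```python
import numpy as np

def regular_simplex(n):
    """Vertices of a regular simplex on the unit sphere, embedded in R^{n+1}.

    Hypothetical helper for illustration: returns an (n+1) x (n+1) array
    whose rows are unit vectors with pairwise inner product -1/n.
    """
    k = n + 1
    # Center the standard basis vectors at their common centroid.
    v = np.eye(k) - np.ones((k, k)) / k
    # Project each vertex onto the unit sphere.
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    return v

V = regular_simplex(4)
# Every vertex lies on the unit sphere.
assert np.allclose(np.linalg.norm(V, axis=1), 1.0)
# All pairwise inner products equal -1/n (here -1/4).
G = V @ V.T
assert np.allclose(G[~np.eye(5, dtype=bool)], -1.0 / 4)
```

The equal pairwise inner products make this the configuration of n + 1 points in the unit ball that are maximally spread out, which is the geometric content of the margin bound.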
Cite

Text: Hush and Scovel. "On the VC Dimension of Bounded Margin Classifiers." Machine Learning, 2001. doi:10.1023/A:1010971905232

Markdown: [Hush and Scovel. "On the VC Dimension of Bounded Margin Classifiers." Machine Learning, 2001.](https://mlanthology.org/mlj/2001/hush2001mlj-vc/) doi:10.1023/A:1010971905232

BibTeX:
@article{hush2001mlj-vc,
title = {{On the VC Dimension of Bounded Margin Classifiers}},
author = {Hush, Don R. and Scovel, Clint},
journal = {Machine Learning},
year = {2001},
pages = {33--44},
doi = {10.1023/A:1010971905232},
volume = {45},
url = {https://mlanthology.org/mlj/2001/hush2001mlj-vc/}
}