Tight Sample Complexity of Large-Margin Learning
Abstract
We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L2 regularization: We introduce the gamma-adapted-dimension, which is a simple function of the spectrum of a distribution's covariance matrix, and show distribution-specific upper and lower bounds on the sample complexity, both governed by the gamma-adapted-dimension of the source distribution. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification. The bounds hold for a rich family of sub-Gaussian distributions.
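The abstract describes the gamma-adapted-dimension as a simple function of the covariance spectrum. As an illustration only, the sketch below computes a quantity of the form k_gamma = min{k : k·gamma² ≥ Σ_{i>k} λ_i}, with the eigenvalues λ_i sorted in decreasing order; this exact form is an assumption made here for illustration and should be checked against the paper's definition.

```python
import numpy as np

def gamma_adapted_dimension(eigenvalues, gamma):
    """Illustrative sketch of a gamma-adapted-dimension-style quantity.

    Assumed form (not taken from this page):
        k_gamma = min{ k : k * gamma^2 >= sum_{i > k} lambda_i },
    where lambda_i are covariance eigenvalues in decreasing order.
    """
    lam = np.sort(np.asarray(eigenvalues, dtype=float))[::-1]
    for k in range(len(lam) + 1):
        tail = lam[k:].sum()  # spectral mass beyond the first k directions
        if k * gamma**2 >= tail:
            return k
    return len(lam)
```

For an isotropic spectrum (all eigenvalues 1) in 10 dimensions with gamma = 1, this returns 5: half the dimensions suffice to cover the remaining spectral mass, matching the intuition that the quantity interpolates between margin and dimension.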
Cite
Text
Sabato et al. "Tight Sample Complexity of Large-Margin Learning." Neural Information Processing Systems, 2010.
Markdown
[Sabato et al. "Tight Sample Complexity of Large-Margin Learning." Neural Information Processing Systems, 2010.](https://mlanthology.org/neurips/2010/sabato2010neurips-tight/)
BibTeX
@inproceedings{sabato2010neurips-tight,
title = {{Tight Sample Complexity of Large-Margin Learning}},
author = {Sabato, Sivan and Srebro, Nathan and Tishby, Naftali},
booktitle = {Neural Information Processing Systems},
year = {2010},
pages = {2038--2046},
url = {https://mlanthology.org/neurips/2010/sabato2010neurips-tight/}
}