On the Importance of Small Coordinate Projections
Abstract
It has recently been shown that sharp generalization bounds can be obtained when the function class from which the algorithm chooses its hypotheses is "small" in the sense that the Rademacher averages of this function class are small. We show that a new, more general principle guarantees good generalization bounds. The new principle requires that random coordinate projections of the function class, evaluated on random samples, are "small" with high probability, and that the random class of functions allows symmetrization. As an example, we prove that this geometric property of the function class is exactly the reason why the two recently proposed frameworks, luckiness (Shawe-Taylor et al., 1998) and algorithmic luckiness (Herbrich and Williamson, 2002), can be used to establish generalization bounds.
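For readers unfamiliar with the terminology, the two objects named in the abstract can be written out in standard notation (the notation below is the conventional one and is not necessarily the one used in the paper). Given a sample X_1, ..., X_n and independent Rademacher signs sigma_1, ..., sigma_n, the empirical Rademacher average of a class F and the coordinate projection of F onto the sample are

\hat{R}_n(F) = \mathbb{E}_{\sigma}\, \sup_{f \in F} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(X_i),
\qquad
F_{|\{X_1,\dots,X_n\}} = \bigl\{ \bigl(f(X_1), \dots, f(X_n)\bigr) : f \in F \bigr\} \subset \mathbb{R}^n .

Roughly speaking, the paper's condition asks that the random sets F_{|\{X_1,\dots,X_n\}} be "small" with high probability, rather than small in expectation as in the Rademacher-average approach.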
Cite
Mendelson and Philips. "On the Importance of Small Coordinate Projections." Journal of Machine Learning Research, 2004.
@article{mendelson2004jmlr-importance,
  title = {{On the Importance of Small Coordinate Projections}},
  author = {Mendelson, Shahar and Philips, Petra},
  journal = {Journal of Machine Learning Research},
  year = {2004},
  volume = {5},
  pages = {219--238},
  url = {https://mlanthology.org/jmlr/2004/mendelson2004jmlr-importance/}
}