Lower Bounds on the VC-Dimension of Smoothly Parametrized Function Classes
Abstract
We examine the relationship between the VC-dimension and the number of parameters of a smoothly parametrized function class. We show that the VC-dimension of such a function class is at least k if there exists a k-dimensional differentiable manifold in the parameter space such that each member of the manifold corresponds to a different decision boundary. Using this result, we are able to obtain lower bounds on the VC-dimension proportional to the number of parameters for several function classes, including two-layer neural networks with certain smooth activation functions and radial basis functions with a Gaussian basis. These lower bounds hold even if the magnitudes of the parameters are restricted to be arbitrarily small. In Valiant's probably approximately correct learning framework, this implies that the number of examples necessary for learning these function classes is at least linear in the number of parameters.
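The central result of the abstract can be restated informally in LaTeX as follows. This is a paraphrase of the abstract's statement only, not the paper's exact theorem; the symbols $\mathcal{F}$, $\Theta$, $M$, and $k$ are notation introduced here for illustration.

```latex
\textbf{Result (informal).}
Let $\mathcal{F} = \{ f_\theta : \theta \in \Theta \}$ be a smoothly
parametrized class of $\{0,1\}$-valued functions. Suppose there exists a
$k$-dimensional differentiable manifold $M \subseteq \Theta$ such that
distinct points of $M$ induce distinct decision boundaries. Then
\[
  \mathrm{VCdim}(\mathcal{F}) \;\ge\; k .
\]
Consequently, in the PAC framework the sample complexity of learning
$\mathcal{F}$ grows at least linearly in $k$, i.e., at least linearly in
the number of parameters.
```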
Cite
Text
Lee et al. "Lower Bounds on the VC-Dimension of Smoothly Parametrized Function Classes." Annual Conference on Computational Learning Theory, 1994. doi:10.1145/180139.181179
Markdown
[Lee et al. "Lower Bounds on the VC-Dimension of Smoothly Parametrized Function Classes." Annual Conference on Computational Learning Theory, 1994.](https://mlanthology.org/colt/1994/lee1994colt-lower/) doi:10.1145/180139.181179
BibTeX
@inproceedings{lee1994colt-lower,
title = {{Lower Bounds on the VC-Dimension of Smoothly Parametrized Function Classes}},
author = {Lee, Wee Sun and Bartlett, Peter L. and Williamson, Robert C.},
booktitle = {Annual Conference on Computational Learning Theory},
year = {1994},
pages = {362--367},
doi = {10.1145/180139.181179},
url = {https://mlanthology.org/colt/1994/lee1994colt-lower/}
}