Neural Nets with Superlinear VC-Dimension
Abstract
It has been known for quite a while that the Vapnik-Chervonenkis dimension (VC-dimension) of a feedforward neural net with linear threshold gates is at most O(w · log w), where w is the total number of weights in the neural net. We show in this paper that this bound is in fact asymptotically optimal. More precisely, we exhibit for any depth d ≥ 3 a large class of feedforward neural nets of depth d with w weights that have VC-dimension Ω(w · log w). This lower bound holds even if the inputs are restricted to Boolean values. The proof of this result relies on a new method that allows us to encode more “program-bits” in the weights of a neural net than previously thought possible.
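The two bounds in the abstract can be restated formally as follows (a sketch in our own notation; the symbols $N$, $N_d$, and $\mathrm{VCdim}(\cdot)$ are our labels rather than the paper's exact theorem statement):

```latex
% Notation (ours): N is a feedforward net of linear threshold gates
% with w weights in total; VCdim denotes the VC-dimension.

% Upper bound (previously known):
\[
  \mathrm{VCdim}(N) \;\le\; O(w \cdot \log w).
\]

% Lower bound (shown in this paper): for every depth d >= 3 there
% exist nets N_d of depth d with w weights, even with inputs
% restricted to Boolean values, such that
\[
  \mathrm{VCdim}(N_d) \;\ge\; \Omega(w \cdot \log w).
\]
```

Together the two bounds show that the known $O(w \cdot \log w)$ upper bound is asymptotically tight for every fixed depth $d \ge 3$.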
Cite
Text
Maass. "Neural Nets with Superlinear VC-Dimension." Neural Computation, 1994. doi:10.1162/NECO.1994.6.5.877Markdown
[Maass. "Neural Nets with Superlinear VC-Dimension." Neural Computation, 1994.](https://mlanthology.org/neco/1994/maass1994neco-neural/) doi:10.1162/NECO.1994.6.5.877BibTeX
@article{maass1994neco-neural,
  title   = {{Neural Nets with Superlinear VC-Dimension}},
  author  = {Maass, Wolfgang},
  journal = {Neural Computation},
  year    = {1994},
  volume  = {6},
  pages   = {877--884},
  doi     = {10.1162/NECO.1994.6.5.877},
  url     = {https://mlanthology.org/neco/1994/maass1994neco-neural/}
}