On the Sample Complexity for Nonoverlapping Neural Networks

Abstract

A neural network is said to be nonoverlapping if there is at most one edge outgoing from each node. We investigate the number of examples that a learning algorithm needs when using nonoverlapping neural networks as hypotheses. We derive bounds for this sample complexity in terms of the Vapnik-Chervonenkis dimension. In particular, we consider networks consisting of threshold, sigmoidal and linear gates. We show that the class of nonoverlapping threshold networks and the class of nonoverlapping sigmoidal networks on n inputs both have Vapnik-Chervonenkis dimension Ω(n log n). This bound is asymptotically tight for the class of nonoverlapping threshold networks. We also present an upper bound for this class where the constants involved are considerably smaller than in a previous calculation. Finally, we argue that the Vapnik-Chervonenkis dimension of nonoverlapping threshold or sigmoidal networks cannot become larger by allowing the nodes to compute linear functions. This sheds some light on a recent result that exhibited neural networks with quadratic Vapnik-Chervonenkis dimension.
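The link between VC dimension and sample complexity that the abstract alludes to can be illustrated with the classical PAC bound m = O((1/ε)(d·ln(1/ε) + ln(1/δ))), where d is the VC dimension. The sketch below is a generic illustration of that standard bound, not a formula from this paper; the constant `c` is schematic (exact constants depend on the derivation), and the growth rate n log n mirrors the paper's Ω(n log n) lower bound with constants omitted.

```python
import math

def pac_sample_complexity(vc_dim, epsilon, delta, c=1.0):
    """Classical PAC upper bound on a sufficient number of examples:
    m = O((1/eps) * (d * ln(1/eps) + ln(1/delta))).
    The constant c is schematic; exact constants vary by derivation."""
    return math.ceil((c / epsilon) * (vc_dim * math.log(1.0 / epsilon)
                                      + math.log(1.0 / delta)))

def vc_growth_nonoverlapping(n):
    """Order of the paper's lower bound Omega(n log n) for nonoverlapping
    threshold/sigmoidal networks on n inputs (constants omitted)."""
    return n * math.log(n)

# Example: a hypothetical network on n = 64 inputs,
# accuracy eps = 0.1, confidence delta = 0.05.
d = vc_growth_nonoverlapping(64)
m = pac_sample_complexity(d, epsilon=0.1, delta=0.05)
```

As the bound suggests, the required number of examples grows roughly linearly with the VC dimension, so the Ω(n log n) lower bound translates into a superlinear sample requirement in the input dimension n.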

Cite

Text

Michael Schmitt. "On the Sample Complexity for Nonoverlapping Neural Networks." Machine Learning, 1999. doi:10.1023/A:1007609822199

Markdown

[Michael Schmitt. "On the Sample Complexity for Nonoverlapping Neural Networks." Machine Learning, 1999.](https://mlanthology.org/mlj/1999/schmitt1999mlj-sample/) doi:10.1023/A:1007609822199

BibTeX

@article{schmitt1999mlj-sample,
  title     = {{On the Sample Complexity for Nonoverlapping Neural Networks}},
  author    = {Schmitt, Michael},
  journal   = {Machine Learning},
  year      = {1999},
  pages     = {131--141},
  doi       = {10.1023/A:1007609822199},
  volume    = {37},
  url       = {https://mlanthology.org/mlj/1999/schmitt1999mlj-sample/}
}