Vapnik-Chervonenkis Generalization Bounds for Real Valued Neural Networks

Abstract

We show how lower bounds on the generalization ability of feedforward neural nets with real outputs can be derived within a formalism based directly on the concept of VC dimension and Vapnik's theorem on uniform convergence of estimated probabilities.
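As context for the abstract, the classic Vapnik uniform-convergence result bounds the gap between true and empirical error in terms of VC dimension and sample size. The sketch below evaluates the textbook form of that gap; it is an illustrative assumption, not the paper's specific bound for real-valued networks.

```python
import math

def vc_gap(m, d, delta):
    """Textbook Vapnik-style uniform-convergence term: with probability
    at least 1 - delta, true error <= empirical error + this gap,
    for a hypothesis class of VC dimension d and sample size m.
    (Illustrative standard form only, not the bound derived in the paper.)"""
    assert m > d > 0 and 0 < delta < 1
    return math.sqrt((d * (math.log(2 * m / d) + 1) + math.log(4 / delta)) / m)

# The gap shrinks roughly like sqrt(d * log(m) / m) as the sample grows.
for m in (1_000, 10_000, 100_000):
    print(m, round(vc_gap(m, d=50, delta=0.05), 3))
```

The bound is distribution-free: it holds uniformly over all hypotheses in the class, which is what makes the VC dimension the controlling quantity.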

Cite

Text

Hole. "Vapnik-Chervonenkis Generalization Bounds for Real Valued Neural Networks." Neural Computation 8(6):1277-1299, 1996. doi:10.1162/NECO.1996.8.6.1277

Markdown

[Hole. "Vapnik-Chervonenkis Generalization Bounds for Real Valued Neural Networks." Neural Computation 8(6):1277-1299, 1996.](https://mlanthology.org/neco/1996/hole1996neco-vapnikchervonenkis/) doi:10.1162/NECO.1996.8.6.1277

BibTeX

@article{hole1996neco-vapnikchervonenkis,
  title     = {{Vapnik-Chervonenkis Generalization Bounds for Real Valued Neural Networks}},
  author    = {Hole, Arne},
  journal   = {Neural Computation},
  year      = {1996},
  pages     = {1277--1299},
  doi       = {10.1162/NECO.1996.8.6.1277},
  volume    = {8},
  number    = {6},
  url       = {https://mlanthology.org/neco/1996/hole1996neco-vapnikchervonenkis/}
}