Exhaustive Learning

Abstract

Exhaustive exploration of an ensemble of networks is used to model learning and generalization in layered neural networks. A simple Boolean learning problem involving networks with binary weights is numerically solved to obtain the entropy S_m and the average generalization ability G_m as a function of the size m of the training set. Learning curves G_m vs. m are shown to depend solely on the distribution of generalization abilities over the ensemble of networks. This distribution is determined prior to learning and provides a novel theoretical tool for predicting network performance on a specific task.
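
The procedure the abstract describes can be illustrated with a small numerical sketch. The code below is an illustrative assumption, not the paper's exact experiment: it uses a toy ensemble of binary-weight perceptrons with N = 7 inputs, picks one member as the teacher defining the Boolean target rule, computes each network's generalization ability g before learning, and then tracks the entropy S_m and average generalization G_m of the networks that remain consistent with the first m training examples.

```python
import itertools
import math
import numpy as np

N = 7                                   # number of binary inputs (odd, so dot products never tie at zero)
rng = np.random.default_rng(0)

# All 2^N input patterns and all 2^N binary-weight perceptrons (entries in {-1, +1}).
inputs = np.array(list(itertools.product([-1, 1], repeat=N)))
weights = np.array(list(itertools.product([-1, 1], repeat=N)))

def outputs(w):
    """Boolean output (+/-1) of a binary-weight perceptron on every input pattern."""
    return np.sign(inputs @ w)

# One network in the ensemble plays the role of the teacher defining the target rule.
teacher = weights[rng.integers(len(weights))]
target = outputs(teacher)

# Generalization ability g of each network: fraction of all inputs on which it agrees
# with the teacher. Its distribution over the ensemble is fixed prior to learning.
all_out = np.array([outputs(w) for w in weights])
g = (all_out == target).mean(axis=1)

# Exhaustive learning: keep every network consistent with the first m training examples,
# then report the entropy S_m of the surviving ensemble and its average generalization G_m.
order = rng.permutation(len(inputs))    # one random training sequence
for m in range(0, len(inputs) + 1, 16):
    shown = order[:m]
    consistent = np.all(all_out[:, shown] == target[shown], axis=1)
    n_m = consistent.sum()
    S_m = math.log(n_m)
    G_m = g[consistent].mean()
    print(f"m={m:3d}  surviving networks={n_m:4d}  S_m={S_m:6.3f}  G_m={G_m:.3f}")
```

As m grows, the consistent ensemble shrinks, so S_m decreases while G_m rises toward 1; averaging such runs over many training sequences would trace the learning curve G_m vs. m.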

Cite

Text

Schwartz et al. "Exhaustive Learning." Neural Computation, 1990. doi:10.1162/NECO.1990.2.3.374

Markdown

[Schwartz et al. "Exhaustive Learning." Neural Computation, 1990.](https://mlanthology.org/neco/1990/schwartz1990neco-exhaustive/) doi:10.1162/NECO.1990.2.3.374

BibTeX

@article{schwartz1990neco-exhaustive,
  title     = {{Exhaustive Learning}},
  author    = {Schwartz, Daniel B. and Samalam, Vijay K. and Solla, Sara A. and Denker, John S.},
  journal   = {Neural Computation},
  year      = {1990},
  pages     = {374--385},
  doi       = {10.1162/NECO.1990.2.3.374},
  volume    = {2},
  url       = {https://mlanthology.org/neco/1990/schwartz1990neco-exhaustive/}
}