Chaitin-Kolmogorov Complexity and Generalization in Neural Networks
Abstract
We present a unified framework for a number of different ways of failing to generalize properly. During learning, sources of random information contaminate the network, effectively augmenting the training data with random information. The complexity of the function computed is therefore increased, and generalization is degraded. We analyze replicated networks, in which a number of identical networks are independently trained on the same data and their results averaged. We conclude that replication almost always results in a decrease in the expected complexity of the network, and that replication therefore increases expected generalization. Simulations confirming the effect are also presented.
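The replication procedure described above can be illustrated with a minimal sketch: train several identical networks independently on the same data, differing only in their random initialization, and average their outputs at test time. The toy regression task, network size, and all names below are illustrative assumptions, not the authors' experimental setup.

```python
# Sketch of "replicated networks": R identical networks trained independently
# on the same data, with their outputs averaged. Hypothetical toy setup.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: noisy samples of a sine wave (assumed, not from the paper).
X = np.linspace(-1, 1, 40).reshape(-1, 1)
y = np.sin(3 * X) + 0.1 * rng.standard_normal(X.shape)

def train_one_network(seed, hidden=20, epochs=2000, lr=0.05):
    """Train one small 1-hidden-layer tanh network by plain gradient descent."""
    r = np.random.default_rng(seed)
    W1 = 0.5 * r.standard_normal((1, hidden))
    b1 = np.zeros(hidden)
    W2 = 0.5 * r.standard_normal((hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)        # forward pass
        pred = h @ W2 + b2
        err = pred - y                  # gradient of squared error w.r.t. pred
        # backward pass
        dW2 = h.T @ err / len(X)
        db2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1 - h ** 2)
        dW1 = X.T @ dh / len(X)
        db1 = dh.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    # Return the trained network as a prediction function.
    return lambda Xq: np.tanh(Xq @ W1 + b1) @ W2 + b2

# Independently train R replicas; only the random seed (initialization) differs.
R = 10
replicas = [train_one_network(seed) for seed in range(R)]

# Averaging the replicas' outputs gives the replicated-network prediction.
X_test = np.linspace(-1, 1, 200).reshape(-1, 1)
single_pred = replicas[0](X_test)
averaged_pred = np.mean([net(X_test) for net in replicas], axis=0)
print("max |single - averaged| =", float(np.abs(single_pred - averaged_pred).max()))
```

The averaging step is the only difference from training a single network; the claim in the abstract is that this averaging lowers the expected complexity of the computed function and thereby improves expected generalization.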
Cite
Text
Pearlmutter and Rosenfeld. "Chaitin-Kolmogorov Complexity and Generalization in Neural Networks." Neural Information Processing Systems, 1990.
Markdown
[Pearlmutter and Rosenfeld. "Chaitin-Kolmogorov Complexity and Generalization in Neural Networks." Neural Information Processing Systems, 1990.](https://mlanthology.org/neurips/1990/pearlmutter1990neurips-chaitinkolmogorov/)
BibTeX
@inproceedings{pearlmutter1990neurips-chaitinkolmogorov,
  title     = {{Chaitin-Kolmogorov Complexity and Generalization in Neural Networks}},
  author    = {Pearlmutter, Barak A. and Rosenfeld, Ronald},
  booktitle = {Neural Information Processing Systems},
  year      = {1990},
  pages     = {925-931},
  url       = {https://mlanthology.org/neurips/1990/pearlmutter1990neurips-chaitinkolmogorov/}
}