Generalization in Partially Connected Layered Neural Networks

Abstract

We study learning from examples in a partially connected single-layer perceptron and a two-layer network, where partially connected student networks learn from fully connected teacher networks. Generalization is studied within the annealed approximation. We first consider a single-layer perceptron with binary weights. When the student is weakly diluted, there is a first-order phase transition from a poor-learning to a good-learning state, similar to that of the fully connected perceptron. Under strong dilution, the first-order phase transition disappears and the generalization error decreases continuously. We also study learning in a two-layer committee machine with binary weights. In contrast to perceptron learning, a first-order transition always exists regardless of dilution; the permutation symmetry is broken at the transition point and the generalization error drops to a non-zero minimum value.
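The setting described above can be illustrated with a small numerical sketch (not the paper's calculation, which is analytic in the annealed approximation). The snippet below, under assumed toy parameters, simulates a diluted binary-weight student perceptron learning from a fully connected binary teacher: since the diluted student cannot realize the teacher, zero-temperature Gibbs learning is approximated by keeping the student configurations with minimal training error, and the generalization error is estimated on fresh random inputs.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

N = 15          # input dimension (kept small so student weights can be enumerated; assumed)
c = 0.6         # connectivity: fraction of inputs the student is wired to (assumed)
P = 30          # number of training examples (assumed)
n_test = 5000   # fresh examples used to estimate the generalization error

# Fully connected teacher with binary weights.
teacher = rng.choice([-1, 1], size=N)

# Dilution: the student sees only a random subset of c*N input sites.
idx = rng.permutation(N)[: int(c * N)]

# Training set labelled by the teacher (N odd, so the output is never zero).
X = rng.choice([-1, 1], size=(P, N))
y = np.sign(X @ teacher)

# Zero-temperature Gibbs learning for the unrealizable (diluted) student:
# enumerate all binary weight vectors on the connected sites and keep
# those with the smallest number of training errors.
candidates = np.array(list(product([-1, 1], repeat=len(idx))))
train_err = np.count_nonzero(np.sign(X[:, idx] @ candidates.T) != y[:, None], axis=0)
best = candidates[train_err == train_err.min()]

# Generalization error averaged over the minimal-error students.
X_test = rng.choice([-1, 1], size=(n_test, N))
y_test = np.sign(X_test @ teacher)
gen_err = np.mean(np.sign(X_test[:, idx] @ best.T) != y_test[:, None])
print(f"{len(best)} minimal-error students, generalization error ≈ {gen_err:.3f}")
```

Varying `c` toward small values in this sketch mimics strong dilution, where the abstract states the generalization error decreases smoothly rather than through a first-order transition; the committee-machine case would additionally require hidden units and averaging over their permutations.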

Cite

Text

Kwon et al. "Generalization in Partially Connected Layered Neural Networks." Annual Conference on Computational Learning Theory, 1994. doi:10.1145/180139.181178

Markdown

[Kwon et al. "Generalization in Partially Connected Layered Neural Networks." Annual Conference on Computational Learning Theory, 1994.](https://mlanthology.org/colt/1994/kwon1994colt-generalization/) doi:10.1145/180139.181178

BibTeX

@inproceedings{kwon1994colt-generalization,
  title     = {{Generalization in Partially Connected Layered Neural Networks}},
  author    = {Kwon, Kyung-Hoon and Kang, Kukjin and Oh, Jong-Hoon},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {1994},
  pages     = {356--361},
  doi       = {10.1145/180139.181178},
  url       = {https://mlanthology.org/colt/1994/kwon1994colt-generalization/}
}