Minimal Feedforward Parity Networks Using Threshold Gates
Abstract
This article presents preliminary research on the general problem of reducing the number of neurons needed in a neural network so that the network can perform a specific recognition task. We consider a single-hidden-layer feedforward network in which only McCulloch-Pitts units are employed in the hidden layer. We show that if only interconnections between adjacent layers are allowed, the minimum size of the hidden layer required to solve the n-bit parity problem is n when n ≤ 4.
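The paper's own minimal constructions are not reproduced here, but the abstract's setting can be illustrated with the textbook n-hidden-unit solution to n-bit parity: hidden McCulloch-Pitts unit i fires when at least i inputs are on, and the output unit combines the hidden activations with alternating +1/−1 weights. The function names below are illustrative, not taken from the paper.

```python
import itertools

def threshold(weighted_sum, theta):
    """McCulloch-Pitts unit: outputs 1 iff the weighted sum reaches theta."""
    return 1 if weighted_sum >= theta else 0

def parity_network(x):
    """Standard n-hidden-unit threshold network for n-bit parity.

    Hidden unit i (i = 1..n) fires iff at least i inputs are on.
    If k inputs are on, exactly the first k hidden units fire, so the
    alternating sum h1 - h2 + h3 - ... equals 1 when k is odd and 0
    when k is even; thresholding that sum at 1 yields the parity bit.
    """
    n = len(x)
    s = sum(x)  # all hidden units share unit input weights
    hidden = [threshold(s, i) for i in range(1, n + 1)]
    out = sum(((-1) ** i) * h for i, h in enumerate(hidden))
    return threshold(out, 1)

# Exhaustive check for n = 4, the largest case covered by the stated result.
for bits in itertools.product([0, 1], repeat=4):
    assert parity_network(list(bits)) == sum(bits) % 2
```

This shows that n hidden threshold units always suffice; the paper's contribution concerns whether fewer are possible, establishing that n is also necessary for n ≤ 4 when only adjacent layers are connected.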
Cite
Text
Fung and Li. "Minimal Feedforward Parity Networks Using Threshold Gates." Neural Computation, 2001. doi:10.1162/089976601300014556
Markdown
[Fung and Li. "Minimal Feedforward Parity Networks Using Threshold Gates." Neural Computation, 2001.](https://mlanthology.org/neco/2001/fung2001neco-minimal/) doi:10.1162/089976601300014556
BibTeX
@article{fung2001neco-minimal,
title = {{Minimal Feedforward Parity Networks Using Threshold Gates}},
author = {Fung, Hon-Kwok and Li, Leong Kwan},
journal = {Neural Computation},
year = {2001},
pages = {319--326},
doi = {10.1162/089976601300014556},
volume = {13},
url = {https://mlanthology.org/neco/2001/fung2001neco-minimal/}
}