Learning Nonoverlapping Perceptron Networks from Examples and Membership Queries
Abstract
We investigate, within the PAC learning model, the problem of learning nonoverlapping perceptron networks (also known as read-once formulas over a weighted threshold basis). These are loop-free neural nets in which each node has only one outgoing weight. We give a polynomial-time algorithm that PAC learns any nonoverlapping perceptron network using examples and membership queries. The algorithm is able to identify both the architecture and the weight values necessary to represent the function to be learned. Our results shed some light on the effect of overlap on the complexity of learning in neural networks.
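To make the model concrete, here is a minimal sketch (not the paper's learning algorithm) of what a nonoverlapping perceptron network computes: a tree of linear threshold units in which every input variable feeds exactly one leaf and every node has a single outgoing edge. The node encoding and the example network are illustrative assumptions.

```python
def threshold(weighted_sum, theta):
    """Linear threshold unit: outputs 1 iff the weighted sum reaches theta."""
    return 1 if weighted_sum >= theta else 0

def evaluate(node, x):
    """Recursively evaluate a tree-shaped (read-once) perceptron network on input x.

    A node is either ("var", i), reading input bit x[i], or
    ("gate", weights, theta, children), a perceptron over its children.
    Read-once property: each input index appears in exactly one leaf,
    so the network is loop-free and each node has one outgoing weight.
    """
    if node[0] == "var":
        return x[node[1]]
    _, weights, theta, children = node
    s = sum(w * evaluate(child, x) for w, child in zip(weights, children))
    return threshold(s, theta)

# Depth-2 example: the root is a 2-of-2 threshold gate (AND) over an
# OR gate on x0, x1 and the literal x2, i.e. (x0 OR x1) AND x2.
net = ("gate", [1, 1], 2,
       [("gate", [1, 1], 1, [("var", 0), ("var", 1)]),
        ("var", 2)])
```

The learning problem the paper addresses is the converse of this evaluation: recover both the tree structure and the weights/thresholds from labeled examples and membership queries.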
Cite
Text
Hancock et al. "Learning Nonoverlapping Perceptron Networks from Examples and Membership Queries." Machine Learning, 1994. doi:10.1007/BF00993305

Markdown
[Hancock et al. "Learning Nonoverlapping Perceptron Networks from Examples and Membership Queries." Machine Learning, 1994.](https://mlanthology.org/mlj/1994/hancock1994mlj-learning/) doi:10.1007/BF00993305

BibTeX
@article{hancock1994mlj-learning,
  title = {{Learning Nonoverlapping Perceptron Networks from Examples and Membership Queries}},
  author = {Hancock, Thomas R. and Golea, Mostefa and Marchand, Mario},
  journal = {Machine Learning},
  year = {1994},
  volume = {16},
  pages = {161--183},
  doi = {10.1007/BF00993305},
  url = {https://mlanthology.org/mlj/1994/hancock1994mlj-learning/}
}