Avoiding Overfitting with BP-SOM
Abstract
Overfitting is a well-known problem in both symbolic and connectionist machine learning. It denotes the deterioration of a trained model's generalisation performance. In this paper, we investigate the ability of a novel artificial neural network, bp-som, to avoid overfitting. bp-som is a hybrid neural network that combines a multi-layered feed-forward network (mfn) with Kohonen's self-organising maps (soms). During training, supervised back-propagation learning and unsupervised som learning cooperate in finding adequate hidden-layer representations. We show that bp-som outperforms standard back-propagation, and also back-propagation with weight decay, when dealing with the problem of overfitting. In addition, we show that bp-som succeeds in preserving generalisation performance under hidden-unit pruning, where both other methods fail.

1 On avoiding overfitting

In machine-learning research, the performance of a trained model is often expressed in its generalisation perfo...
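The cooperation described in the abstract, where back-propagation and unsupervised SOM learning jointly shape the hidden-layer representations, can be illustrated with a minimal sketch. This is not the authors' implementation: the actual bp-som algorithm uses class-labelled som elements and a reliability-based som error (see the paper), while the sketch below simply adds a pull toward the best-matching som prototype to the hidden-layer gradient. The pull strength `alpha`, the map size, and the toy XOR task are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR, learned by a small MFN with one hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden, n_som = 8, 16                     # hidden units; som prototypes (4x4 map, flattened)
W1 = rng.normal(0, 0.5, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)
som = rng.normal(0, 0.5, (n_som, n_hidden))  # prototype vectors in hidden-activation space

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse():
    h = sigmoid(X @ W1 + b1)
    return float(((sigmoid(h @ W2 + b2) - y) ** 2).mean())

lr, som_lr, alpha = 0.5, 0.1, 0.1            # alpha: strength of the som pull (assumed)
initial_loss = mse()

for epoch in range(5000):
    for i in range(len(X)):
        x, t = X[i], y[i]
        h = sigmoid(x @ W1 + b1)             # hidden-layer representation
        o = sigmoid(h @ W2 + b2)             # network output
        do = (o - t) * o * (1 - o)           # back-propagation output delta
        # Unsupervised side: best-matching som prototype for this hidden vector.
        bmu = int(np.argmin(((som - h) ** 2).sum(axis=1)))
        # Combined hidden delta: BP term plus a pull toward the BMU prototype,
        # so similar inputs are nudged toward shared hidden-layer representations.
        dh = (do @ W2.T + alpha * (h - som[bmu])) * h * (1 - h)
        W2 -= lr * np.outer(h, do); b2 -= lr * do
        W1 -= lr * np.outer(x, dh); b1 -= lr * dh
        # som learning: move the winning prototype toward the hidden vector.
        som[bmu] += som_lr * (h - som[bmu])

final_loss = mse()
print(initial_loss, final_loss)
```

The interesting design point is the extra term in `dh`: back-propagation alone is free to place hidden activations anywhere, whereas the som pull encourages them to cluster around a small set of prototypes, which is the intuition behind bp-som's robustness to overfitting and to hidden-unit pruning.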
Weijters et al. "Avoiding Overfitting with BP-SOM." International Joint Conference on Artificial Intelligence, 1997.
BibTeX
@inproceedings{weijters1997ijcai-avoiding,
title = {{Avoiding Overfitting with BP-SOM}},
author = {Weijters, Ton and van den Herik, H. Jaap and van den Bosch, Antal and Postma, Eric O.},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {1997},
pages = {1140--1145},
url = {https://mlanthology.org/ijcai/1997/weijters1997ijcai-avoiding/}
}