The Cascade-Correlation Learning Architecture
Abstract
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
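The growth loop described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: it assumes tanh units, least-squares training of the output weights, and plain gradient ascent on the covariance between a candidate unit's output and the residual error (the paper's correlation criterion, simplified to a single output and a single candidate). XOR is used as a toy training set.

```python
# Minimal Cascade-Correlation sketch (illustrative assumptions, not the
# original code): tanh hidden units, least-squares output training,
# gradient ascent on the candidate/residual covariance. Task: XOR.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

def with_bias(F):
    return np.hstack([F, np.ones((F.shape[0], 1))])

features = X  # each new hidden unit appends one frozen feature column

for unit in range(5):
    # 1. Retrain output weights on all current features (least squares).
    A = with_bias(features)
    w_out, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ w_out
    if np.max(np.abs(resid)) < 1e-3:
        break  # network is big enough; stop growing
    # 2. Train a candidate unit to maximize the magnitude of the
    #    covariance between its output and the residual error.
    e = resid - resid.mean()          # centered residual
    w = rng.normal(size=A.shape[1])
    for _ in range(2000):
        v = np.tanh(A @ w)
        cov = np.dot(v, e)            # e is centered, so this is the cov
        w += 0.1 * np.sign(cov) * (A.T @ ((1 - v**2) * e))
    # 3. Freeze the winning candidate's input weights; its output becomes
    #    a permanent feature available to the outputs and later units.
    features = np.hstack([features, np.tanh(A @ w)[:, None]])

A = with_bias(features)
w_out, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = (A @ w_out > 0.5).astype(float)
print(pred.tolist())
```

Because every added unit sees the outputs of all earlier units (here, via the growing `features` matrix), the hidden units form a cascade of increasingly complex feature detectors, and no error signal is ever propagated backward through frozen connections.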
Cite
Text
Fahlman and Lebiere. "The Cascade-Correlation Learning Architecture." Neural Information Processing Systems, 1989.
Markdown
[Fahlman and Lebiere. "The Cascade-Correlation Learning Architecture." Neural Information Processing Systems, 1989.](https://mlanthology.org/neurips/1989/fahlman1989neurips-cascadecorrelation/)
BibTeX
@inproceedings{fahlman1989neurips-cascadecorrelation,
title = {{The Cascade-Correlation Learning Architecture}},
author = {Fahlman, Scott E. and Lebiere, Christian},
booktitle = {Neural Information Processing Systems},
year = {1989},
pages = {524-532},
url = {https://mlanthology.org/neurips/1989/fahlman1989neurips-cascadecorrelation/}
}