Fast Pruning Using Principal Components
Abstract
We present a new algorithm for eliminating excess parameters and improving network generalization after supervised training. The method, "Principal Components Pruning (PCP)", is based on principal component analysis of the node activations of successive layers of the network. It is simple, cheap to implement, and effective. It requires no network retraining, and does not involve calculating the full Hessian of the cost function. Only the weight and the node activity correlation matrices for each layer of nodes are required. We demonstrate the efficacy of the method on a regression problem using polynomial basis functions, and on an economic time series prediction problem using a two-layer, feedforward network.
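The abstract describes pruning via principal component analysis of node activations, using only each layer's weight matrix and node activity correlation matrix. The following is a minimal sketch of that idea, not the authors' exact procedure; the function name `pcp_prune_layer` and its parameters are illustrative assumptions.

```python
import numpy as np

def pcp_prune_layer(X, W, k):
    """Sketch: project a layer's weights onto the top-k principal
    components of its input activations (assumed interpretation of
    the PCP idea from the abstract, not the paper's exact algorithm).

    X : (n_samples, d) matrix of node activations feeding this layer
    W : (d, m) weight matrix from these d nodes to the next layer
    k : number of principal directions to retain
    """
    # Node activity correlation matrix for this layer
    C = X.T @ X / X.shape[0]
    # Eigendecomposition (eigh returns eigenvalues in ascending order)
    eigvals, eigvecs = np.linalg.eigh(C)
    # Keep the k directions with the largest eigenvalues
    Vk = eigvecs[:, -k:]
    # Replace W by its projection onto the retained subspace,
    # discarding weight components along low-variance directions
    return Vk @ (Vk.T @ W)

# Example usage with random data (for illustration only)
X = np.random.randn(500, 10)   # 500 samples of 10 node activations
W = np.random.randn(10, 4)     # weights to a 4-node next layer
W_pruned = pcp_prune_layer(X, W, k=6)
```

No retraining step appears here, consistent with the abstract's claim that the method avoids retraining and avoids computing the full Hessian.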
Cite
Text
Levin et al. "Fast Pruning Using Principal Components." Neural Information Processing Systems, 1993.

Markdown

[Levin et al. "Fast Pruning Using Principal Components." Neural Information Processing Systems, 1993.](https://mlanthology.org/neurips/1993/levin1993neurips-fast/)

BibTeX
@inproceedings{levin1993neurips-fast,
title = {{Fast Pruning Using Principal Components}},
author = {Levin, Asriel U. and Leen, Todd K. and Moody, John E.},
booktitle = {Neural Information Processing Systems},
year = {1993},
pages = {35-42},
url = {https://mlanthology.org/neurips/1993/levin1993neurips-fast/}
}