Pruning Using Parameter and Neuronal Metrics
Abstract
In this article, we introduce a measure of optimality for architecture selection algorithms for neural networks: the distance from the original network to the new network in a metric defined by the probability distributions of all possible networks. We derive two pruning algorithms, one based on a metric in parameter space and the other based on a metric in neuron space, which are closely related to well-known architecture selection algorithms, such as GOBS. Our framework extends the theoretical range of validity of GOBS and can therefore explain results observed in previous experiments. In addition, we give some computational improvements for these algorithms.
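The pruning algorithms discussed here build on Hessian-based saliency criteria in the style of optimal brain surgeon (OBS). As a rough illustration only (the metrics derived in the paper differ in detail), the classic OBS criterion scores each weight by the loss increase incurred when it is removed, $L_q = w_q^2 / (2\,[H^{-1}]_{qq})$, and updates the remaining weights accordingly. A minimal NumPy sketch of that criterion, with a hypothetical dense Hessian passed in directly:

```python
import numpy as np

def obs_saliencies(weights, hessian):
    """OBS-style saliency of each weight: L_q = w_q^2 / (2 * [H^-1]_qq).

    `weights` is a 1-D parameter vector and `hessian` the (assumed
    invertible) Hessian of the training error at a local minimum.
    """
    h_inv = np.linalg.inv(hessian)
    return weights ** 2 / (2.0 * np.diag(h_inv))

def prune_one(weights, hessian):
    """Remove the lowest-saliency weight and compensate the others.

    The update delta_w = -(w_q / [H^-1]_qq) * H^-1[:, q] drives the
    pruned weight exactly to zero while minimizing the error increase
    to second order. Returns the new weight vector and the pruned index.
    """
    h_inv = np.linalg.inv(hessian)
    saliencies = weights ** 2 / (2.0 * np.diag(h_inv))
    q = int(np.argmin(saliencies))
    delta = -(weights[q] / h_inv[q, q]) * h_inv[:, q]
    return weights + delta, q
```

For a diagonal Hessian the compensation touches only the pruned weight; off-diagonal curvature is what lets OBS-style methods outperform simple magnitude pruning.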
Cite
Text
van de Laar and Heskes. "Pruning Using Parameter and Neuronal Metrics." Neural Computation, 1999. doi:10.1162/089976699300016548
Markdown
[van de Laar and Heskes. "Pruning Using Parameter and Neuronal Metrics." Neural Computation, 1999.](https://mlanthology.org/neco/1999/vandelaar1999neco-pruning/) doi:10.1162/089976699300016548
BibTeX
@article{vandelaar1999neco-pruning,
title = {{Pruning Using Parameter and Neuronal Metrics}},
author = {van de Laar, Piërre and Heskes, Tom},
journal = {Neural Computation},
year = {1999},
pages = {977-993},
doi = {10.1162/089976699300016548},
volume = {11},
url = {https://mlanthology.org/neco/1999/vandelaar1999neco-pruning/}
}