Neural Network Pruning with Tukey-Kramer Multiple Comparison Procedure

Abstract

Reducing a neural network's complexity improves its ability to generalize to future examples. Like an overfitted regression function, neural networks may miss their target because of the excessive degrees of freedom stored in unnecessary parameters. Over the past decade, research on network pruning has produced nonstatistical algorithms such as Skeletonization, Optimal Brain Damage, and Optimal Brain Surgeon, which remove the connections with the least salience. The method proposed here uses the bootstrap algorithm to estimate the distribution of the model parameter saliences. Statistical multiple comparison procedures are then used to make pruning decisions. We show this method compares well with Optimal Brain Surgeon in both its ability to prune and the resulting network performance.
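The pipeline the abstract outlines — bootstrap the salience of each weight, then apply a multiple-comparison test to decide which weights to prune — can be sketched roughly as follows. This is not the authors' implementation: salience here is assumed to be a per-example scalar already computed for each weight, the test compares each weight only against the most salient one (a simplification of the full all-pairs Tukey-Kramer procedure), and `q_crit` is a placeholder constant rather than a studentized-range quantile.

```python
import random
import statistics

def bootstrap_saliences(per_example_saliences, n_boot=200, seed=0):
    """Bootstrap-resample examples to estimate the distribution of each
    weight's mean salience. Input: list of rows, one row per training
    example, each row holding one salience value per weight."""
    rng = random.Random(seed)
    n = len(per_example_saliences)
    boots = []
    for _ in range(n_boot):
        # resample examples with replacement
        sample = [per_example_saliences[rng.randrange(n)] for _ in range(n)]
        # mean salience per weight over the resampled examples
        boots.append([statistics.fmean(col) for col in zip(*sample)])
    # one bootstrap distribution (tuple of n_boot means) per weight
    return list(zip(*boots))

def tukey_kramer_prune(boot_dists, q_crit=3.5):
    """Flag weights whose mean salience falls significantly below that of
    the most salient weight. q_crit is a hypothetical fixed critical value;
    a faithful Tukey-Kramer test would draw it from the studentized-range
    distribution for the given number of groups and error df."""
    means = [statistics.fmean(d) for d in boot_dists]
    ses = [statistics.stdev(d) for d in boot_dists]   # bootstrap standard errors
    top = max(range(len(means)), key=means.__getitem__)
    prune = []
    for i, (m, se) in enumerate(zip(means, ses)):
        if i == top:
            continue
        # Tukey-Kramer-style pairwise standard error for unequal variances
        se_pair = ((se ** 2 + ses[top] ** 2) / 2) ** 0.5
        if (means[top] - m) / se_pair > q_crit:
            prune.append(i)  # significantly less salient -> prune candidate
    return prune
```

For example, with three weights whose per-example saliences cluster near 0.0, 1.0, and 0.8, the first weight's bootstrap distribution sits far below the maximum and it is flagged for pruning, while the top-salience weight is always kept.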

Cite

Text

Duckro et al. "Neural Network Pruning with Tukey-Kramer Multiple Comparison Procedure." Neural Computation, 2002. doi:10.1162/089976602753633420

Markdown

[Duckro et al. "Neural Network Pruning with Tukey-Kramer Multiple Comparison Procedure." Neural Computation, 2002.](https://mlanthology.org/neco/2002/duckro2002neco-neural/) doi:10.1162/089976602753633420

BibTeX

@article{duckro2002neco-neural,
  title     = {{Neural Network Pruning with Tukey-Kramer Multiple Comparison Procedure}},
  author    = {Duckro, Donald E. and Quinn, Dennis W. and Gardner, III, Samuel J.},
  journal   = {Neural Computation},
  year      = {2002},
  pages     = {1149--1168},
  doi       = {10.1162/089976602753633420},
  volume    = {14},
  url       = {https://mlanthology.org/neco/2002/duckro2002neco-neural/}
}