Pruning’s Effect on Generalization Through the Lens of Training and Regularization

Abstract

Practitioners frequently observe that pruning improves model generalization. A long-standing hypothesis based on the bias-variance trade-off attributes this generalization improvement to model size reduction. However, recent studies on over-parameterization characterize a new model size regime in which larger models achieve better generalization. Pruning models in this over-parameterized regime leads to a contradiction -- while theory predicts that reducing model size harms generalization, pruning to a range of sparsities nonetheless improves it. Motivated by this contradiction, we re-examine pruning’s effect on generalization empirically. We show that size reduction cannot fully account for the generalization-improving effect of standard pruning algorithms. Instead, we find that pruning leads to better training at specific sparsities, improving the training loss over the dense model. We also find that pruning leads to additional regularization at other sparsities, reducing the accuracy degradation due to noisy examples relative to the dense model. Pruning extends model training time and reduces model size; these two factors improve training and add regularization, respectively. We empirically demonstrate that both factors are essential to fully explaining pruning's impact on generalization.
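The "standard pruning algorithms" the abstract refers to are typically magnitude-based: weights with the smallest absolute values are zeroed out until a target sparsity is reached. The following is a minimal illustrative sketch of one-shot magnitude pruning in pure Python; it is not the paper's implementation, and the function name and example weights are made up for illustration.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest magnitude.

    weights  -- flat list of floats
    sparsity -- fraction in [0, 1) of weights to remove
    Returns (pruned_weights, keep_mask).
    """
    k = int(sparsity * len(weights))
    if k == 0:
        return list(weights), [True] * len(weights)
    # Threshold is the magnitude of the k-th smallest |w|; everything
    # at or below it is pruned (ties are pruned together).
    threshold = sorted(abs(w) for w in weights)[k - 1]
    mask = [abs(w) > threshold for w in weights]
    pruned = [w if keep else 0.0 for w, keep in zip(weights, mask)]
    return pruned, mask


# Example: prune half of an 8-weight layer.
w = [0.9, -0.05, 0.4, -0.8, 0.02, 0.6, -0.3, 0.1]
pruned, mask = magnitude_prune(w, 0.5)
# The four largest-magnitude weights survive; the rest become 0.0.
```

In practice (e.g., iterative magnitude pruning), this masking step is interleaved with further training, which is consistent with the abstract's observation that pruning extends training time while reducing model size.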

Cite

Text

Jin et al. "Pruning’s Effect on Generalization Through the Lens of Training and Regularization." Neural Information Processing Systems, 2022.

Markdown

[Jin et al. "Pruning’s Effect on Generalization Through the Lens of Training and Regularization." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/jin2022neurips-prunings/)

BibTeX

@inproceedings{jin2022neurips-prunings,
  title     = {{Pruning’s Effect on Generalization Through the Lens of Training and Regularization}},
  author    = {Jin, Tian and Carbin, Michael and Roy, Dan and Frankle, Jonathan and Dziugaite, Gintare Karolina},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/jin2022neurips-prunings/}
}