Convolutional Neural Network Pruning with Structural Redundancy Reduction

Abstract

Convolutional neural network (CNN) pruning has become one of the most successful network compression approaches in recent years. Existing work on network pruning usually focuses on removing the least important filters in the network to achieve compact architectures. In this study, we claim that identifying structural redundancy plays a more essential role than finding unimportant filters, both theoretically and empirically. We first statistically model the network pruning problem from a redundancy-reduction perspective and find that pruning in the layer(s) with the most structural redundancy outperforms pruning the least important filters across all layers. Based on this finding, we then propose a network pruning approach that identifies the structural redundancy of a CNN and prunes filters in the selected layer(s) with the most redundancy. Experiments on various benchmark network architectures and datasets show that our proposed approach significantly outperforms previous state-of-the-art methods.
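The core idea of the abstract — score each layer's structural redundancy, then prune filters only in the most redundant layer(s) — can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: it uses mean pairwise cosine similarity between flattened filters as a stand-in redundancy measure (the paper's own graph-based redundancy metric is not reproduced here), and the function names are hypothetical.

```python
import numpy as np

def layer_redundancy(filters):
    """Redundancy proxy for one conv layer: mean absolute cosine
    similarity over all off-diagonal pairs of flattened filters.
    `filters` has shape (n_filters, in_channels, kH, kW)."""
    flat = filters.reshape(filters.shape[0], -1)
    unit = flat / np.maximum(np.linalg.norm(flat, axis=1, keepdims=True), 1e-12)
    sim = np.abs(unit @ unit.T)
    n = sim.shape[0]
    return (sim.sum() - n) / (n * (n - 1))  # exclude the diagonal (self-similarity)

def prune_most_redundant_layer(layers, prune_ratio=0.25):
    """Pick the layer with the highest redundancy score and drop the
    filters that are most similar to the rest of that layer.
    Returns (index of pruned layer, remaining filters of that layer)."""
    scores = [layer_redundancy(f) for f in layers]
    target = int(np.argmax(scores))

    flat = layers[target].reshape(layers[target].shape[0], -1)
    unit = flat / np.maximum(np.linalg.norm(flat, axis=1, keepdims=True), 1e-12)
    sim = np.abs(unit @ unit.T)
    np.fill_diagonal(sim, 0.0)
    per_filter_redundancy = sim.sum(axis=1)  # how "duplicated" each filter is

    n_keep = flat.shape[0] - int(round(prune_ratio * flat.shape[0]))
    keep = np.sort(np.argsort(per_filter_redundancy)[:n_keep])
    return target, layers[target][keep]
```

For example, given one layer of random (diverse) filters and one layer of near-duplicate filters, the sketch selects the near-duplicate layer and removes a quarter of its filters, while the diverse layer is left untouched — matching the abstract's claim that where you prune matters more than which individual filters you rank as unimportant.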

Cite

Text

Wang et al. "Convolutional Neural Network Pruning with Structural Redundancy Reduction." Conference on Computer Vision and Pattern Recognition, 2021. doi:10.1109/CVPR46437.2021.01467

Markdown

[Wang et al. "Convolutional Neural Network Pruning with Structural Redundancy Reduction." Conference on Computer Vision and Pattern Recognition, 2021.](https://mlanthology.org/cvpr/2021/wang2021cvpr-convolutional/) doi:10.1109/CVPR46437.2021.01467

BibTeX

@inproceedings{wang2021cvpr-convolutional,
  title     = {{Convolutional Neural Network Pruning with Structural Redundancy Reduction}},
  author    = {Wang, Zi and Li, Chengcheng and Wang, Xiangyang},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2021},
  pages     = {14913--14922},
  doi       = {10.1109/CVPR46437.2021.01467},
  url       = {https://mlanthology.org/cvpr/2021/wang2021cvpr-convolutional/}
}