Reborn Filters: Pruning Convolutional Neural Networks with Limited Data
Abstract
Channel pruning is effective in compressing pretrained CNNs for deployment on low-end edge devices. Most existing methods independently prune some of the original channels and require the complete original dataset to recover the performance drop after pruning. However, due to commercial protection or data privacy, users may only have access to a tiny portion of the training examples, which can be insufficient for performance recovery. In this paper, for pruning with limited data, we propose to use all original filters to directly develop new compact filters, named reborn filters, so that all useful structural priors in the original filters are preserved in the pruned networks, alleviating the performance drop accordingly. During training, reborn filters can be easily implemented via 1×1 convolutional layers and then fused at the inference stage for acceleration. Extensive experiments demonstrate the effectiveness and superiority of the proposed channel pruning algorithm based on reborn filters.
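The fusion step described in the abstract follows from the linearity of convolution: a 1×1 convolution that linearly combines the outputs of the original filters can be folded into the filter weights themselves, yielding a single smaller convolution at inference. The sketch below is not the authors' implementation; it is a minimal NumPy illustration (hypothetical shapes, bias-free, stride-1 valid convolution) of why the two-stage training form and the fused inference form produce identical outputs.

```python
import numpy as np

def conv2d(x, w):
    """Naive stride-1, valid-padding 2D convolution.
    x: (C_in, H, W); w: (C_out, C_in, k, k) -> (C_out, H-k+1, W-k+1)."""
    c_out, c_in, k, _ = w.shape
    h, wd = x.shape[1] - k + 1, x.shape[2] - k + 1
    out = np.zeros((c_out, h, wd))
    for i in range(h):
        for j in range(wd):
            # inner product of each filter with the local patch
            out[:, i, j] = np.tensordot(w, x[:, i:i + k, j:j + k], axes=3)
    return out

rng = np.random.default_rng(0)
W0 = rng.standard_normal((8, 3, 3, 3))  # 8 original filters (hypothetical sizes)
A = rng.standard_normal((4, 8))         # 1x1 conv: 4 reborn filters from 8 originals
x = rng.standard_normal((3, 6, 6))      # input feature map

# Training-time form: original conv followed by a 1x1 combination layer.
y_two_stage = np.einsum('oc,chw->ohw', A, conv2d(x, W0))

# Inference-time form: fuse the 1x1 layer into the filter weights,
# producing 4 compact "reborn" filters directly.
W_fused = np.einsum('oc,cikl->oikl', A, W0)
y_fused = conv2d(x, W_fused)

assert np.allclose(y_two_stage, y_fused)  # the two forms agree by linearity
```

The fused network runs a single 4-filter convolution instead of the original 8-filter convolution plus a 1×1 layer, which is where the inference acceleration comes from.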
Cite
Text
Tang et al. "Reborn Filters: Pruning Convolutional Neural Networks with Limited Data." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I04.6058
Markdown
[Tang et al. "Reborn Filters: Pruning Convolutional Neural Networks with Limited Data." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/tang2020aaai-reborn/) doi:10.1609/AAAI.V34I04.6058
BibTeX
@inproceedings{tang2020aaai-reborn,
title = {{Reborn Filters: Pruning Convolutional Neural Networks with Limited Data}},
author = {Tang, Yehui and You, Shan and Xu, Chang and Han, Jin and Qian, Chen and Shi, Boxin and Xu, Chao and Zhang, Changshui},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
pages = {5972-5980},
doi = {10.1609/AAAI.V34I04.6058},
url = {https://mlanthology.org/aaai/2020/tang2020aaai-reborn/}
}