Scaling up Exact Neural Network Compression by ReLU Stability
Abstract
We can compress a rectifier network while exactly preserving its underlying functionality with respect to a given input domain if some of its neurons are stable. However, current approaches to determine the stability of neurons with Rectified Linear Unit (ReLU) activations require solving or finding a good approximation to multiple discrete optimization problems. In this work, we introduce an algorithm based on solving a single optimization problem to identify all stable neurons. Our approach is a median of 183 times faster than the state-of-the-art method on CIFAR-10, which allows us to explore exact compression on deeper (5 × 100) and wider (2 × 800) networks within minutes. For classifiers trained under an amount of L1 regularization that does not worsen accuracy, we can remove up to 56% of the connections on the CIFAR-10 dataset. The code is available at https://github.com/yuxwind/ExactCompression.
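To make the notion of exact compression concrete, here is a minimal sketch, assuming a fully connected ReLU network given as NumPy weight matrices and a box input domain. It certifies stably-inactive units with simple interval bound propagation, which is a weaker heuristic than the single optimization problem proposed in the paper, and then removes them, leaving the network's function unchanged on that domain. The function names `interval_affine` and `remove_stably_inactive` are illustrative and not taken from the paper's code.

```python
# Sketch only: interval-bound stability check and exact removal of
# stably-inactive ReLU units. The paper certifies stability with a single
# optimization problem; this heuristic finds only a subset of stable units.
import numpy as np

def interval_affine(W, b, lo, hi):
    """Bounds of W @ x + b for x in the box [lo, hi] (elementwise)."""
    W_pos, W_neg = np.maximum(W, 0), np.minimum(W, 0)
    return W_pos @ lo + W_neg @ hi + b, W_pos @ hi + W_neg @ lo + b

def remove_stably_inactive(weights, biases, lo, hi):
    """Drop hidden units whose pre-activation upper bound is <= 0 on the box.
    Such units always output 0, so deleting them (and the corresponding
    columns of the next layer) preserves the network exactly."""
    weights = [W.copy() for W in weights]
    biases = [b.copy() for b in biases]
    for i in range(len(weights)):
        pre_lo, pre_hi = interval_affine(weights[i], biases[i], lo, hi)
        if i < len(weights) - 1:               # never prune the output layer
            keep = pre_hi > 0                  # pre_hi <= 0  =>  stably inactive
            weights[i], biases[i] = weights[i][keep], biases[i][keep]
            weights[i + 1] = weights[i + 1][:, keep]  # its output was always 0
            # bounds fed to the next layer are the ReLU images of these bounds
            lo = np.maximum(pre_lo[keep], 0)
            hi = np.maximum(pre_hi[keep], 0)
    return weights, biases
```

A hypothetical usage on a randomly initialized 784-100-100-10 network (random weights are unlikely to yield stable units; trained, L1-regularized networks are where the paper finds many):

```python
rng = np.random.default_rng(0)
sizes = [784, 100, 100, 10]
Ws = [rng.standard_normal((m, n)) / np.sqrt(n) for n, m in zip(sizes, sizes[1:])]
bs = [rng.standard_normal(m) for m in sizes[1:]]
Wp, bp = remove_stably_inactive(Ws, bs, np.zeros(784), np.ones(784))
print([W.shape for W in Wp])  # hidden widths shrink if any unit was certified inactive
```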
Cite
Text
Serra et al. "Scaling up Exact Neural Network Compression by ReLU Stability." Neural Information Processing Systems, 2021.
Markdown
[Serra et al. "Scaling up Exact Neural Network Compression by ReLU Stability." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/serra2021neurips-scaling/)
BibTeX
@inproceedings{serra2021neurips-scaling,
  title = {{Scaling up Exact Neural Network Compression by ReLU Stability}},
  author = {Serra, Thiago and Yu, Xin and Kumar, Abhinav and Ramalingam, Srikumar},
  booktitle = {Neural Information Processing Systems},
  year = {2021},
  url = {https://mlanthology.org/neurips/2021/serra2021neurips-scaling/}
}