Optimal Brain Compression: A Framework for Accurate Post-Training Quantization and Pruning

Abstract

We consider the problem of model compression for deep neural networks (DNNs) in the challenging one-shot/post-training setting, in which we are given an accurate trained model, and must compress it without any retraining, based only on a small amount of calibration input data. This problem has become popular in view of the emerging software and hardware support for executing models compressed via pruning and/or quantization with speedup, and well-performing solutions have been proposed independently for both compression approaches. In this paper, we introduce a new compression framework which covers both weight pruning and quantization in a unified setting, is time- and space-efficient, and considerably improves upon the practical performance of existing post-training methods. At the technical level, our approach is based on an exact and efficient realization of the classical Optimal Brain Surgeon (OBS) framework of [LeCun, Denker, and Solla, 1990], extended to also cover weight quantization at the scale of modern DNNs. From the practical perspective, our experimental results show that it can improve significantly upon the compression-accuracy trade-offs of existing post-training methods, and that it can enable the accurate compound application of both pruning and quantization in a post-training setting.
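For context, the classical OBS framework referenced in the abstract selects and removes weights using second-order information. As a hedged sketch (notation is ours, not taken from the paper): for a weight vector $w$ with Hessian $H$ of the layer-wise loss, removing the single weight $w_q$ while optimally adjusting the remaining weights incurs a loss increase and update

$$\rho_q = \frac{w_q^2}{2\,[H^{-1}]_{qq}}, \qquad \delta w = -\frac{w_q}{[H^{-1}]_{qq}}\, H^{-1} e_q,$$

where $e_q$ is the $q$-th canonical basis vector. The paper's contribution, per the abstract, is an exact and efficient realization of this framework at the scale of modern DNNs, extended to also cover weight quantization.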

Cite

Text

Frantar and Alistarh. "Optimal Brain Compression: A Framework for Accurate Post-Training Quantization and Pruning." Neural Information Processing Systems, 2022.

Markdown

[Frantar and Alistarh. "Optimal Brain Compression: A Framework for Accurate Post-Training Quantization and Pruning." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/frantar2022neurips-optimal/)

BibTeX

@inproceedings{frantar2022neurips-optimal,
  title     = {{Optimal Brain Compression: A Framework for Accurate Post-Training Quantization and Pruning}},
  author    = {Frantar, Elias and Alistarh, Dan},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/frantar2022neurips-optimal/}
}