Cluster-Promoting Quantization with Bit-Drop for Minimizing Network Quantization Loss
Abstract
Network quantization, which aims to reduce the bit-lengths of the network weights and activations, has emerged as a key technique for deploying networks on resource-limited devices. Although recent studies have successfully discretized a full-precision network, they still incur large quantization errors after training, giving rise to a significant performance gap between a full-precision network and its quantized counterpart. In this work, we propose a novel quantization method for neural networks, Cluster-Promoting Quantization (CPQ), which finds the optimal quantization grids while naturally encouraging the underlying full-precision weights to gather cohesively around those grids during training. This property of CPQ stems from two main ingredients that enable differentiable quantization: i) the use of a categorical distribution designed by a specific probabilistic parametrization in the forward pass and ii) our proposed multi-class straight-through estimator (STE) in the backward pass. Since the second component, the multi-class STE, is intrinsically biased, we additionally propose a new bit-drop technique, DropBits, which revises the standard dropout regularization to randomly drop bits instead of neurons. As a natural extension of DropBits, we further introduce a way of learning heterogeneous quantization levels, finding the proper bit-length for each layer by imposing an additional regularization on DropBits. We experimentally validate our method on various benchmark datasets and network architectures, and also support a new hypothesis for quantization: learning heterogeneous quantization levels outperforms training from scratch with the same but fixed quantization levels.
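The two mechanisms the abstract describes, quantizing weights through a straight-through estimator and randomly dropping bits during training, can be sketched roughly as follows. This is a minimal illustrative sketch in PyTorch, assuming a plain per-tensor symmetric uniform quantizer and a simple random bit-length sampler; the names `uniform_quantize_ste` and `dropbits_quantize`, the scale choice, and the sampling scheme are placeholders and do not reproduce the paper's CPQ forward pass, multi-class STE, or the actual DropBits regularizer.

```python
import torch

def uniform_quantize_ste(w, num_bits):
    # Generic symmetric uniform quantizer with a straight-through estimator:
    # the forward pass returns the quantized weights, while gradients flow to
    # w as if quantization were the identity. This is NOT the paper's CPQ /
    # multi-class STE; per-tensor max scaling is a simplifying assumption.
    qmax = 2 ** (num_bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    q = torch.round(w / scale).clamp(-qmax, qmax) * scale
    return w + (q - w).detach()  # forward: q, backward: identity gradient

def dropbits_quantize(w, max_bits=4, p_drop=0.5):
    # Bit-drop sketch: with probability p_drop, quantize this training step at
    # a randomly reduced bit-length instead of max_bits (dropping bits rather
    # than neurons). The sampling scheme here is illustrative only.
    if torch.rand(()).item() < p_drop and max_bits > 2:
        num_bits = int(torch.randint(2, max_bits, (1,)).item())
    else:
        num_bits = max_bits
    return uniform_quantize_ste(w, num_bits)

# Example: quantize a weight tensor during one training step.
w = torch.randn(64, 32, requires_grad=True)
w_q = dropbits_quantize(w, max_bits=4, p_drop=0.5)
loss = (w_q ** 2).sum()
loss.backward()  # gradients reach w through the straight-through estimator
```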
Cite
Text
Lee et al. "Cluster-Promoting Quantization with Bit-Drop for Minimizing Network Quantization Loss." International Conference on Computer Vision, 2021. doi:10.1109/ICCV48922.2021.00532
Markdown
[Lee et al. "Cluster-Promoting Quantization with Bit-Drop for Minimizing Network Quantization Loss." International Conference on Computer Vision, 2021.](https://mlanthology.org/iccv/2021/lee2021iccv-clusterpromoting/) doi:10.1109/ICCV48922.2021.00532
BibTeX
@inproceedings{lee2021iccv-clusterpromoting,
title = {{Cluster-Promoting Quantization with Bit-Drop for Minimizing Network Quantization Loss}},
author = {Lee, Jung Hyun and Yun, Jihun and Hwang, Sung Ju and Yang, Eunho},
booktitle = {International Conference on Computer Vision},
year = {2021},
pages = {5370--5379},
doi = {10.1109/ICCV48922.2021.00532},
url = {https://mlanthology.org/iccv/2021/lee2021iccv-clusterpromoting/}
}