Quantized Convolutional Neural Networks Through the Lens of Partial Differential Equations

Abstract

Quantization of Convolutional Neural Networks (CNNs) is a common approach to easing the computational burden of deploying CNNs. However, fixed-point arithmetic is not a natural fit for the computations involved in neural networks. In our work, we consider symmetric and stable variants of common CNNs for image classification, and of Graph Convolutional Networks (GCNs) for graph node classification. We demonstrate through several experiments that the property of forward stability preserves the action of a network under different quantization rates, allowing stable quantized networks to behave similarly to their non-quantized counterparts while using fewer parameters. We also find that, in some cases, stability improves accuracy. These properties are of particular interest for sensitive, resource-constrained, or real-time applications.
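
To make the idea of a "symmetric and stable" block concrete, below is a minimal PyTorch sketch (not the authors' implementation) of one explicit-Euler step of the PDE-inspired layer x_{j+1} = x_j - h * K^T sigma(K x_j), where sharing the kernel K between a convolution and its transpose promotes forward stability. The uniform fake-quantizer, bit width, step size h, and all names below are illustrative assumptions, not the paper's exact setup.

import torch
import torch.nn as nn
import torch.nn.functional as F


def fake_quantize(w: torch.Tensor, num_bits: int = 4) -> torch.Tensor:
    """Uniform symmetric fake quantization of a weight tensor (illustrative)."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = w.abs().max() / qmax + 1e-12
    wq = torch.clamp(torch.round(w / scale), -qmax - 1, qmax) * scale
    # Straight-through estimator: quantized values in the forward pass,
    # identity gradient in the backward pass.
    return w + (wq - w).detach()


class SymmetricResBlock(nn.Module):
    """One explicit-Euler step of dx/dt = -K^T sigma(K x) with a shared kernel K."""

    def __init__(self, channels: int, h: float = 0.1, num_bits: int = 4):
        super().__init__()
        self.K = nn.Parameter(torch.randn(channels, channels, 3, 3) * 0.05)
        self.h = h
        self.num_bits = num_bits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        Kq = fake_quantize(self.K, self.num_bits)
        z = F.conv2d(x, Kq, padding=1)            # K x
        z = torch.relu(z)                         # sigma(K x)
        z = F.conv_transpose2d(z, Kq, padding=1)  # K^T sigma(K x), adjoint of the same kernel
        return x - self.h * z                     # stable explicit Euler step


if __name__ == "__main__":
    block = SymmetricResBlock(channels=8, num_bits=4)
    x = torch.randn(2, 8, 32, 32)
    print(block(x).shape)  # torch.Size([2, 8, 32, 32])

The key design point this sketch tries to convey is the weight sharing: because the transposed convolution reuses the quantized kernel Kq, the block's linearized dynamics are negative semi-definite, which is what keeps the forward propagation stable even as the quantization rate changes.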

Cite

Text

Ben-Yair et al. "Quantized Convolutional Neural Networks Through the Lens of Partial Differential Equations." NeurIPS 2021 Workshops: DLDE, 2021.

Markdown

[Ben-Yair et al. "Quantized Convolutional Neural Networks Through the Lens of Partial Differential Equations." NeurIPS 2021 Workshops: DLDE, 2021.](https://mlanthology.org/neuripsw/2021/benyair2021neuripsw-quantized/)

BibTeX

@inproceedings{benyair2021neuripsw-quantized,
  title     = {{Quantized Convolutional Neural Networks Through the Lens of Partial Differential Equations}},
  author    = {Ben-Yair, Ido and Eliasof, Moshe and Treister, Eran},
  booktitle = {NeurIPS 2021 Workshops: DLDE},
  year      = {2021},
  url       = {https://mlanthology.org/neuripsw/2021/benyair2021neuripsw-quantized/}
}