Compressing the Activation Maps in Deep Convolutional Neural Networks and Its Regularizing Effect

Abstract

Deep learning has dramatically improved performance in various image analysis applications over the last few years. However, modern deep learning architectures can be very large, with up to hundreds of layers and millions or even billions of model parameters, and the data produced during training may not fit into the memory of commodity graphics processing units (GPUs). We propose a novel approach for compressing high-dimensional activation maps, the most memory-consuming part of training modern deep learning architectures. The proposed method can compress the feature maps of a single layer, multiple layers, or the entire network, according to specific needs. To this end, we evaluated three different methods for compressing the activation maps: the Wavelet Transform, the Discrete Cosine Transform, and Simple Thresholding. We performed experiments on two classification tasks for natural images and two semantic segmentation tasks for medical images. Using the proposed method, we reduced the memory usage for activation maps by up to 95%. Additionally, we show that the proposed method induces a regularization effect that acts on the layer weight gradients.
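
To make the idea concrete, below is a minimal sketch of the simplest of the three evaluated methods, Simple Thresholding, in PyTorch. It is an illustration under assumptions, not the authors' implementation: a ReLU that stores only the largest-magnitude fraction of its input for the backward pass and reconstructs a lossy dense activation map when the gradient is needed. The class name ThresholdCompressedReLU and the keep_ratio parameter are hypothetical.

import torch

class ThresholdCompressedReLU(torch.autograd.Function):
    """Sketch of activation-map compression by simple thresholding.

    Forward computes a standard ReLU but, instead of saving the dense
    input for backward, keeps only the top-k activation magnitudes
    (values plus their flat indices). Backward reconstructs a lossy
    dense map from them and applies the ReLU gradient mask.
    """

    @staticmethod
    def forward(ctx, x, keep_ratio=0.05):
        y = x.clamp(min=0)
        flat = x.reshape(-1)
        # Keep only the largest-magnitude fraction of the entries.
        k = max(1, int(keep_ratio * flat.numel()))
        _, idx = flat.abs().topk(k)
        ctx.save_for_backward(idx, flat[idx])
        ctx.shape = x.shape
        return y

    @staticmethod
    def backward(ctx, grad_out):
        idx, vals = ctx.saved_tensors
        # Decompress: rebuild a (lossy) dense activation map.
        flat = torch.zeros(grad_out.numel(),
                           dtype=grad_out.dtype,
                           device=grad_out.device)
        flat[idx] = vals
        x_rec = flat.reshape(ctx.shape)
        # ReLU gradient computed from the reconstructed map; the loss
        # of small activations is what perturbs the weight gradients.
        return grad_out * (x_rec > 0).to(grad_out.dtype), None

# Usage: x = torch.randn(8, 64, 32, 32, requires_grad=True)
#        y = ThresholdCompressedReLU.apply(x, 0.05)
#        y.sum().backward()

Note that the gradient is computed from the reconstructed, thresholded activation map rather than the exact one, which is the mechanism behind the regularizing effect on the layer weight gradients described in the abstract. A transform-based variant would instead store thresholded Wavelet or Discrete Cosine Transform coefficients and invert the transform in backward.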

Cite

Text

Vu et al. "Compressing the Activation Maps in Deep Convolutional Neural Networks and Its Regularizing Effect." Transactions on Machine Learning Research, 2024.

Markdown

[Vu et al. "Compressing the Activation Maps in Deep Convolutional Neural Networks and Its Regularizing Effect." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/vu2024tmlr-compressing/)

BibTeX

@article{vu2024tmlr-compressing,
  title     = {{Compressing the Activation Maps in Deep Convolutional Neural Networks and Its Regularizing Effect}},
  author    = {Vu, Minh Hoang and Garpebring, Anders and Nyholm, Tufve and Löfstedt, Tommy},
  journal   = {Transactions on Machine Learning Research},
  year      = {2024},
  url       = {https://mlanthology.org/tmlr/2024/vu2024tmlr-compressing/}
}