Network Sketching: Exploiting Binary Structure in Deep CNNs
Abstract
Convolutional neural networks (CNNs) with deep architectures have substantially advanced the state of the art in computer vision tasks. However, deep networks are typically resource-intensive and thus difficult to deploy on mobile devices. Recently, CNNs with binary weights have shown compelling efficiency, but the accuracy of such models is usually unsatisfactory in practice. In this paper, we introduce network sketching as a novel technique for pursuing binary-weight CNNs, targeting more faithful inference and a better trade-off for practical applications. Our basic idea is to exploit binary structure directly in pre-trained filter banks and to produce binary-weight models via tensor expansion. The whole process can be treated as a coarse-to-fine model approximation, akin to the pencil-drawing steps of outlining and shading. To further speed up the generated models, namely the sketches, we also propose an associative implementation of binary tensor convolutions. Experimental results demonstrate that a proper sketch of AlexNet (or ResNet) outperforms existing binary-weight models by large margins on the ImageNet large-scale classification task, while requiring only slightly more memory for network parameters.
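The expansion described in the abstract amounts to approximating each pre-trained filter bank W as a weighted sum of binary tensors, W ≈ Σ_j a_j B_j with B_j ∈ {−1, +1}, fitted greedily on the residual. Below is a minimal NumPy sketch of this greedy, direct-approximation idea; the function name `sketch`, the choice of m = 3 terms, and the random filter bank are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def sketch(weights, m=3):
    """Greedily approximate a real-valued filter bank as a sum of
    scaled binary tensors: W ~= sum_j a_j * B_j with B_j in {-1, +1}.

    Minimal illustration of the direct-approximation idea; the number
    of terms `m` and the greedy residual-fitting rule are assumptions.
    """
    residual = weights.astype(np.float64).copy()
    scales, binaries = [], []
    for _ in range(m):
        b = np.sign(residual)
        b[b == 0] = 1.0                 # keep entries strictly in {-1, +1}
        a = np.abs(residual).mean()     # least-squares scale for a sign tensor
        scales.append(a)
        binaries.append(b)
        residual -= a * b               # fit the next binary term to the residual
    return scales, binaries

# Usage: approximate a random 64x3x3x3 filter bank with 3 binary terms.
W = np.random.randn(64, 3, 3, 3)
scales, binaries = sketch(W, m=3)
W_hat = sum(a * b for a, b in zip(scales, binaries))
rel_err = np.linalg.norm(W - W_hat) / np.linalg.norm(W)
print(f"relative approximation error with m=3 terms: {rel_err:.3f}")
```

Each added term refines the residual left by the previous ones, which is the coarse-to-fine behavior the abstract likens to outlining and then shading a pencil drawing.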
Cite
Text
Guo et al. "Network Sketching: Exploiting Binary Structure in Deep CNNs." Conference on Computer Vision and Pattern Recognition, 2017. doi:10.1109/CVPR.2017.430Markdown
[Guo et al. "Network Sketching: Exploiting Binary Structure in Deep CNNs." Conference on Computer Vision and Pattern Recognition, 2017.](https://mlanthology.org/cvpr/2017/guo2017cvpr-network/) doi:10.1109/CVPR.2017.430BibTeX
@inproceedings{guo2017cvpr-network,
title = {{Network Sketching: Exploiting Binary Structure in Deep CNNs}},
author = {Guo, Yiwen and Yao, Anbang and Zhao, Hao and Chen, Yurong},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2017},
doi = {10.1109/CVPR.2017.430},
url = {https://mlanthology.org/cvpr/2017/guo2017cvpr-network/}
}