BCNN: A Binary CNN with All Matrix Ops Quantized to 1 Bit Precision
Abstract
This paper describes a CNN in which all CNN-style 2D convolution operations that lower to matrix-matrix multiplication are fully binary. The network is derived from a common building-block structure consistent with a constructive proof outline showing that binary neural networks are universal function approximators. A 2-step training procedure achieves 71.24% top-1 accuracy on the 2012 ImageNet validation set, and implementation strategies optimized for binary operands are provided.
Cite
Text
Redfern et al. "BCNN: A Binary CNN with All Matrix Ops Quantized to 1 Bit Precision." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2021. doi:10.1109/CVPRW53098.2021.00518
Markdown
[Redfern et al. "BCNN: A Binary CNN with All Matrix Ops Quantized to 1 Bit Precision." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2021.](https://mlanthology.org/cvprw/2021/redfern2021cvprw-bcnn/) doi:10.1109/CVPRW53098.2021.00518
BibTeX
@inproceedings{redfern2021cvprw-bcnn,
title = {{BCNN: A Binary CNN with All Matrix Ops Quantized to 1 Bit Precision}},
author = {Redfern, Arthur J. and Zhu, Lijun and Newquist, Molly K.},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2021},
pages = {4604--4612},
doi = {10.1109/CVPRW53098.2021.00518},
url = {https://mlanthology.org/cvprw/2021/redfern2021cvprw-bcnn/}
}