Training Binary Neural Network Without Batch Normalization for Image Super-Resolution
Abstract
Recently, binary neural network (BNN) based super-resolution (SR) methods have enjoyed initial success in the SR field. However, there is a noticeable performance gap between binarized models and their full-precision counterparts. Furthermore, the batch normalization (BN) layers in binary SR networks introduce floating-point computation, which is unfriendly to low-precision hardware. There is therefore still room for improvement in both model performance and efficiency. Focusing on this issue, in this paper, we first explore a novel binary training mechanism based on the feature distribution, which allows us to replace all BN layers with a simple training method. Then, we construct a strong baseline by combining the highlights of recent binarization methods, which already surpasses the state of the art. Next, to train a highly accurate binarized SR model, we develop a lightweight network architecture and a multi-stage knowledge distillation strategy to enhance the model's representation ability. Extensive experiments demonstrate that the proposed method not only requires less computation than conventional floating-point networks but also outperforms state-of-the-art binary methods on standard SR networks.
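The abstract does not spell out the BN-free training mechanism itself, but as background, the sketch below shows the standard building blocks such binary SR networks rest on: sign binarization with a clipped straight-through estimator (STE) and an XNOR-Net-style per-channel weight scale, which supplies the rescaling that BN would otherwise provide. This is a minimal PyTorch sketch; the class names (`BinaryActivation`, `BinaryConv2d`) are illustrative, and it is not a reproduction of the paper's actual feature-distribution-based method.

```python
import torch
import torch.nn as nn

class BinaryActivation(torch.autograd.Function):
    """Sign binarization with a clipped straight-through estimator (STE)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass gradients through only where |x| <= 1 (clipped STE).
        return grad_output * (x.abs() <= 1).float()

class BinaryConv2d(nn.Conv2d):
    """Conv layer with sign-binarized weights and a per-channel
    floating-point scale, standing in for the rescaling a BN layer
    would otherwise perform (hypothetical sketch, not the paper's method)."""

    def forward(self, x):
        # Per-output-channel scale = mean absolute weight (XNOR-Net style).
        scale = self.weight.abs().mean(dim=(1, 2, 3), keepdim=True)
        bin_w = BinaryActivation.apply(self.weight) * scale
        return nn.functional.conv2d(
            x, bin_w, self.bias, self.stride,
            self.padding, self.dilation, self.groups,
        )
```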
Cite
Text
Jiang et al. "Training Binary Neural Network Without Batch Normalization for Image Super-Resolution." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I2.16263
Markdown
[Jiang et al. "Training Binary Neural Network Without Batch Normalization for Image Super-Resolution." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/jiang2021aaai-training/) doi:10.1609/AAAI.V35I2.16263
BibTeX
@inproceedings{jiang2021aaai-training,
title = {{Training Binary Neural Network Without Batch Normalization for Image Super-Resolution}},
author = {Jiang, Xinrui and Wang, Nannan and Xin, Jingwei and Li, Keyu and Yang, Xi and Gao, Xinbo},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2021},
pages = {1700--1707},
doi = {10.1609/AAAI.V35I2.16263},
url = {https://mlanthology.org/aaai/2021/jiang2021aaai-training/}
}