Learning Architectures for Binary Networks
Abstract
Backbone architectures of most binary networks are well-known floating point architectures such as the ResNet family. Questioning whether architectures designed for floating point networks are also the best for binary networks, we propose to search architectures for binary networks (BNAS) by defining a new search space for binary architectures and a novel search objective. Specifically, building on the cell based search method, we define a new search space of binary layer types, design a new cell template, and rediscover the utility of the Zeroise layer, proposing to use it as a learned layer type rather than a mere placeholder. The novel search objective diversifies early search to learn better performing binary architectures. We show that our proposed method searches architectures with stable training curves despite the quantization error inherent in binary networks. Quantitative analyses demonstrate that our searched architectures outperform the architectures used in state-of-the-art binary networks and outperform or perform on par with state-of-the-art binary networks that employ various techniques other than architectural changes.
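To make the abstract's key ingredients concrete, here is a minimal sketch of a cell based mixed operation over binary layer types in which the Zeroise layer (an operation that outputs zeros regardless of its input) is a genuine candidate rather than a placeholder. The operation set, function names, and the XNOR-Net-style scaled sign binarization are illustrative assumptions, not the authors' exact search space or implementation.

```python
import numpy as np

def binarize(x):
    """Scaled sign binarization (XNOR-Net style): sign(x) times mean |x|.
    Assumed here as a stand-in for the binarization used in binary layers."""
    return np.sign(x) * np.abs(x).mean()

def zeroise(x):
    """Zeroise layer: outputs zeros regardless of the input."""
    return np.zeros_like(x)

def binary_op_stub(x):
    """Stand-in for a binary layer type: binarizes activations (weights omitted)."""
    return binarize(x)

def identity(x):
    """Skip connection candidate."""
    return x

# Hypothetical candidate set of binary layer types on one edge of a cell.
CANDIDATE_OPS = [zeroise, binary_op_stub, identity]

def mixed_op(x, alphas):
    """DARTS-style softmax-weighted sum of candidate ops on a cell edge.
    During search, the architecture parameters `alphas` are learned;
    afterwards, the highest-weight op (possibly Zeroise) is kept."""
    w = np.exp(alphas - alphas.max())
    w = w / w.sum()
    return sum(wi * op(x) for wi, op in zip(w, CANDIDATE_OPS))
```

If search drives nearly all the weight onto Zeroise for an edge, that edge's output collapses to zeros, which is exactly why treating Zeroise as a selectable layer type (not a placeholder) changes the searched architectures.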
Cite
Text
Kim et al. "Learning Architectures for Binary Networks." Proceedings of the European Conference on Computer Vision (ECCV), 2020. doi:10.1007/978-3-030-58610-2_34
Markdown
[Kim et al. "Learning Architectures for Binary Networks." Proceedings of the European Conference on Computer Vision (ECCV), 2020.](https://mlanthology.org/eccv/2020/kim2020eccv-learning-a/) doi:10.1007/978-3-030-58610-2_34
BibTeX
@inproceedings{kim2020eccv-learning-a,
title = {{Learning Architectures for Binary Networks}},
author = {Kim, Dahyun and Singh, Kunal Pratap and Choi, Jonghyun},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2020},
doi = {10.1007/978-3-030-58610-2_34},
url = {https://mlanthology.org/eccv/2020/kim2020eccv-learning-a/}
}