Unifying and Boosting Gradient-Based Training-Free Neural Architecture Search
Abstract
Neural architecture search (NAS) has gained immense popularity owing to its ability to automate neural architecture design. A number of training-free metrics have recently been proposed to realize NAS without training, hence making NAS more scalable. Despite their competitive empirical performance, a unified theoretical understanding of these training-free metrics is lacking. As a consequence, (a) the relationships among these metrics are unclear, (b) there is no theoretical interpretation of their empirical performance, and (c) there may exist untapped potential in existing training-free NAS that could be unveiled through a unified theoretical understanding. To this end, this paper presents a unified theoretical analysis of gradient-based training-free NAS, which allows us to (a) theoretically study the relationships among these metrics, (b) theoretically guarantee their generalization performance, and (c) exploit our unified theoretical understanding to develop a novel framework named hybrid NAS (HNAS) that consistently boosts training-free NAS in a principled way. Remarkably, HNAS enjoys the advantages of both training-free NAS (i.e., superior search efficiency) and training-based NAS (i.e., remarkable search effectiveness), which we demonstrate through extensive experiments.
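To make the notion of a "gradient-based training-free metric" concrete, below is a minimal sketch of scoring randomly initialized candidate architectures with a simple gradient-norm score (in the spirit of metrics such as GradNorm/SNIP that this line of work analyzes). The model, data, and metric choice here are illustrative assumptions for exposition only; this is not the paper's HNAS framework itself.

```python
# Minimal sketch (assumed setup): rank candidate architectures at initialization
# using a gradient-norm-based training-free score. Illustrative only.
import torch
import torch.nn as nn


def grad_norm_score(model: nn.Module, inputs: torch.Tensor,
                    targets: torch.Tensor) -> float:
    """Sum of parameter-gradient norms from a single backward pass (no training)."""
    model.zero_grad()
    loss = nn.functional.cross_entropy(model(inputs), targets)
    loss.backward()
    score = 0.0
    for p in model.parameters():
        if p.grad is not None:
            score += p.grad.detach().norm().item()
    return score


if __name__ == "__main__":
    x = torch.randn(8, 3, 32, 32)      # one dummy mini-batch of images
    y = torch.randint(0, 10, (8,))     # dummy labels
    # Two toy "candidate architectures"; in practice these come from a search space.
    candidates = [
        nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10)),
        nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10)),
    ]
    scores = [grad_norm_score(m, x, y) for m in candidates]
    print(scores)  # candidates are ranked by score without any training
```

The key point is that each candidate is scored from a single forward/backward pass at initialization, which is what makes such metrics far cheaper than training-based NAS.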
Cite
Text

Shu et al. "Unifying and Boosting Gradient-Based Training-Free Neural Architecture Search." Neural Information Processing Systems, 2022.

Markdown

[Shu et al. "Unifying and Boosting Gradient-Based Training-Free Neural Architecture Search." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/shu2022neurips-unifying/)

BibTeX
@inproceedings{shu2022neurips-unifying,
title = {{Unifying and Boosting Gradient-Based Training-Free Neural Architecture Search}},
author = {Shu, Yao and Dai, Zhongxiang and Wu, Zhaoxuan and Low, Bryan Kian Hsiang},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/shu2022neurips-unifying/}
}