When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search

Abstract

The key challenge in neural architecture search (NAS) is exploring the huge search space wisely. We propose a new NAS method called TNAS (NAS with trees), which improves search efficiency by exploring only a small number of architectures while achieving higher search accuracy. TNAS introduces an architecture tree and a binary operation tree to factorize the search space and substantially reduce the number of architectures that must be explored. TNAS then performs a modified bi-level breadth-first search (BFS) over the proposed trees to discover a high-performance architecture. Impressively, TNAS finds the globally optimal architecture on CIFAR-10 in NAS-Bench-201, reaching a test accuracy of 94.37% in four GPU hours. Its average test accuracy is 94.35%, which outperforms the state of the art. Code is available at: https://github.com/guochengqian/TNAS.
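The abstract sketches the core idea: a binary operation tree recursively halves the candidate operation set, so the search decides one split at a time instead of choosing among all operations at once, and a BFS-style level-wise expansion with pruning keeps the number of evaluated candidates small. The paper's exact procedure and benchmark interface are not reproduced here, so the following is only a hypothetical toy illustration of that factorization idea: the `OPS` list, the 3-edge cell, the `score` stand-in evaluator, and the beam width are all illustrative assumptions, not the authors' implementation.

```python
from itertools import product

# Toy setting (illustrative, not NAS-Bench-201): a cell with 3 edges,
# each edge picking one of 4 operations. The operation set is organized
# as a binary tree, so each search level halves every edge's op range.
OPS = ["none", "skip", "conv1x1", "conv3x3"]

def score(op_indices):
    # Stand-in evaluator: a real search would use (proxy) validation
    # accuracy here; this toy function just lets the example run.
    return sum(len(OPS[i]) for i in op_indices)

def bfs_search(num_edges=3, beam=4):
    """Level-wise (BFS-style) search over a binary operation tree:
    at each depth, every unresolved edge's op range is split in half,
    candidates are expanded, scored, and only the top `beam` survive."""
    # Each candidate maps every edge to an index range [lo, hi) of OPS.
    frontier = [tuple((0, len(OPS)) for _ in range(num_edges))]
    while any(hi - lo > 1 for lo, hi in frontier[0]):
        children = []
        for cand in frontier:
            halves = []
            for lo, hi in cand:
                if hi - lo == 1:            # edge already resolved
                    halves.append([(lo, hi)])
                else:                        # split range at the midpoint
                    mid = (lo + hi) // 2
                    halves.append([(lo, mid), (mid, hi)])
            children.extend(product(*halves))
        # Score a partial candidate via a representative op per edge
        # (the first index of each range), then prune to the beam.
        children.sort(key=lambda c: score([lo for lo, _ in c]), reverse=True)
        frontier = children[:beam]
    best = frontier[0]
    return [OPS[lo] for lo, _ in best]
```

Because each level halves the remaining operation choices, only O(log |OPS|) expansion rounds are needed per edge, which is the sense in which the tree factorization shrinks the exploration size relative to enumerating every architecture.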

Cite

Text

Qian et al. "When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2022. doi:10.1109/CVPRW56347.2022.00314

Markdown

[Qian et al. "When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2022.](https://mlanthology.org/cvprw/2022/qian2022cvprw-nas/) doi:10.1109/CVPRW56347.2022.00314

BibTeX

@inproceedings{qian2022cvprw-nas,
  title     = {{When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search}},
  author    = {Qian, Guocheng and Zhang, Xuanyang and Li, Guohao and Zhao, Chen and Chen, Yukang and Zhang, Xiangyu and Ghanem, Bernard and Sun, Jian},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2022},
  pages     = {2781--2786},
  doi       = {10.1109/CVPRW56347.2022.00314},
  url       = {https://mlanthology.org/cvprw/2022/qian2022cvprw-nas/}
}