EENAS: An Efficient Evolutionary Algorithm for Neural Architecture Search
Abstract
Neural Architecture Search (NAS) has been widely applied to automatic neural architecture design. Traditional NAS methods evaluate a large number of candidate architectures, incurring expensive computational overhead. To speed up architecture search, recent NAS methods employ network estimation strategies to guide the selection of promising architectures. In this paper, we propose an efficient evolutionary algorithm for NAS, which adopts a state-of-the-art proxy based on synthetic signal bases for architecture estimation. Extensive experiments show that our method outperforms state-of-the-art NAS methods on the NAS-Bench-101 and NAS-Bench-201 search spaces (CIFAR-10, CIFAR-100, and ImageNet16-120). Compared with existing works, our method identifies better architectures with greatly reduced search time.
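The general recipe the abstract describes, evolutionary search steered by a cheap architecture estimator instead of full training, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the encoding, `proxy_score`, and all parameters are hypothetical stand-ins (the toy proxy merely mimics a training-free scorer such as a synthetic-signal-based estimator).

```python
import random

# Hypothetical toy search space: an architecture is a list of operation
# indices, one per edge, loosely modeled on cell-based spaces like
# NAS-Bench-201. NUM_EDGES and NUM_OPS are illustrative, not the paper's.
NUM_EDGES = 6
NUM_OPS = 5

def random_arch():
    return [random.randrange(NUM_OPS) for _ in range(NUM_EDGES)]

def mutate(arch):
    """Flip one randomly chosen edge to a different operation."""
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = random.choice([op for op in range(NUM_OPS) if op != child[i]])
    return child

def proxy_score(arch):
    """Stand-in for a training-free estimator (the paper uses a proxy
    based on synthetic signal bases); here just a deterministic toy
    function of the encoding so the sketch runs without any training."""
    return sum((op + 1) * (i + 1) for i, op in enumerate(arch)) % 97

def evolutionary_search(pop_size=20, generations=50, sample_size=5, seed=0):
    """Regularized-evolution-style loop guided by the proxy score."""
    random.seed(seed)
    population = [random_arch() for _ in range(pop_size)]
    best = max(population, key=proxy_score)
    for _ in range(generations):
        # Tournament selection: mutate the fittest of a random sample.
        parent = max(random.sample(population, sample_size), key=proxy_score)
        child = mutate(parent)
        # Age-based replacement: the oldest member is removed.
        population.pop(0)
        population.append(child)
        if proxy_score(child) > proxy_score(best):
            best = child
    return best
```

Because every fitness query is a cheap proxy evaluation rather than a training run, the loop can explore many candidates quickly, which is the source of the search-time reduction the abstract claims.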
Cite
Text
Jian et al. "EENAS: An Efficient Evolutionary Algorithm for Neural Architecture Search." Proceedings of The 14th Asian Conference on Machine Learning, 2022.
Markdown
[Jian et al. "EENAS: An Efficient Evolutionary Algorithm for Neural Architecture Search." Proceedings of The 14th Asian Conference on Machine Learning, 2022.](https://mlanthology.org/acml/2022/jian2022acml-eenas/)
BibTeX
@inproceedings{jian2022acml-eenas,
title = {{EENAS: An Efficient Evolutionary Algorithm for Neural Architecture Search}},
author = {Jian, Zheng and Wenran, Han and Ying, Zhang and Shufan, Ji},
booktitle = {Proceedings of The 14th Asian Conference on Machine Learning},
year = {2022},
pages = {1261--1276},
volume = {189},
url = {https://mlanthology.org/acml/2022/jian2022acml-eenas/}
}