EENA: Efficient Evolution of Neural Architecture
Abstract
The latest algorithms for automatic neural architecture search perform remarkably well but are largely directionless in their exploration of the search space and computationally expensive in the training of every intermediate architecture. In this paper, we propose a method for efficient architecture search called EENA (Efficient Evolution of Neural Architecture). Due to the elaborately designed mutation and crossover operations, the evolution process can be guided by the information that has already been learned. Therefore, less computational effort is required and the search and training time can be reduced significantly. On CIFAR-10 classification, EENA using minimal computational resources (0.65 GPU-days) can design a highly effective neural architecture that achieves 2.56% test error with 8.47M parameters. Furthermore, the best architecture discovered also transfers to CIFAR-100.
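To illustrate the kind of guided evolutionary loop the abstract describes, below is a minimal Python sketch. The encoding (`Architecture` as a list of layer widths), the `evaluate` stub, and all hyperparameters are hypothetical placeholders; the actual EENA method inherits trained weights through function-preserving mutation and crossover, which this toy example only notes in comments and does not model.

# A minimal, illustrative sketch of a guided evolutionary search loop in the
# spirit of EENA. Names (Architecture, evaluate) and the fitness stub are
# hypothetical; the real method inherits trained weights through
# function-preserving mutation/crossover, which is not modeled here.
import random
from dataclasses import dataclass

@dataclass
class Architecture:
    widths: list          # channel widths per layer (toy encoding)
    fitness: float = 0.0  # validation accuracy after a short training run

def evaluate(arch: Architecture) -> float:
    # Placeholder: in practice, briefly train the (weight-inherited) network
    # and return validation accuracy; here we just reward total capacity.
    return sum(arch.widths) / (100.0 * len(arch.widths))

def mutate(parent: Architecture) -> Architecture:
    # Widen one layer; EENA's mutations are function-preserving, so a child
    # starts from the parent's learned behavior instead of random weights.
    child = Architecture(widths=list(parent.widths))
    i = random.randrange(len(child.widths))
    child.widths[i] = min(child.widths[i] * 2, 512)
    return child

def crossover(a: Architecture, b: Architecture) -> Architecture:
    # Per-layer mix of two parent architectures.
    widths = [random.choice(pair) for pair in zip(a.widths, b.widths)]
    return Architecture(widths=widths)

def evolve(generations: int = 10, pop_size: int = 8) -> Architecture:
    population = [Architecture([16, 32, 64]) for _ in range(pop_size)]
    for arch in population:
        arch.fitness = evaluate(arch)
    for _ in range(generations):
        parents = sorted(population, key=lambda a: a.fitness, reverse=True)[:2]
        child = mutate(crossover(*parents))
        child.fitness = evaluate(child)
        # Steady-state step: the new child replaces the weakest individual.
        weakest = min(range(pop_size), key=lambda i: population[i].fitness)
        population[weakest] = child
    return max(population, key=lambda a: a.fitness)

if __name__ == "__main__":
    best = evolve()
    print("best widths:", best.widths, "fitness:", round(best.fitness, 3))

Because each child reuses what its parents have already learned rather than training from scratch, the per-candidate training cost drops sharply, which is where the reported 0.65 GPU-days figure comes from.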
Cite
Text
Zhu et al. "EENA: Efficient Evolution of Neural Architecture." IEEE/CVF International Conference on Computer Vision Workshops, 2019. doi:10.1109/ICCVW.2019.00238
Markdown
[Zhu et al. "EENA: Efficient Evolution of Neural Architecture." IEEE/CVF International Conference on Computer Vision Workshops, 2019.](https://mlanthology.org/iccvw/2019/zhu2019iccvw-eena/) doi:10.1109/ICCVW.2019.00238
BibTeX
@inproceedings{zhu2019iccvw-eena,
title = {{EENA: Efficient Evolution of Neural Architecture}},
author = {Zhu, Hui and An, Zhulin and Yang, Chuanguang and Xu, Kaiqiang and Zhao, Erhu and Xu, Yongjun},
booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
year = {2019},
pages = {1891--1899},
doi = {10.1109/ICCVW.2019.00238},
url = {https://mlanthology.org/iccvw/2019/zhu2019iccvw-eena/}
}