M-NAS: Meta Neural Architecture Search
Abstract
Neural Architecture Search (NAS) has recently outperformed hand-crafted networks in various areas. However, most prevalent NAS methods focus only on a pre-defined task. For a previously unseen task, the architecture is either searched from scratch, which is inefficient, or transferred from one obtained on another task, which may be sub-optimal. In this paper, we investigate a previously unexplored problem: does a universal NAS method exist that can effectively generate task-aware architectures? Toward this problem, we propose Meta Neural Architecture Search (M-NAS). To obtain task-specific architectures, M-NAS adopts a task-aware architecture controller for child model generation. Since the optimal weights for different tasks and architectures vary widely, we resort to meta-learning and learn meta-weights that efficiently adapt to a new task on the corresponding architecture with only a few gradient descent steps. Experimental results demonstrate the superiority of M-NAS over a number of competitive baselines on both toy regression and few-shot classification problems.
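To make the adaptation step concrete, below is a minimal sketch (not the authors' code) of the MAML-style inner loop the abstract describes: shared meta-weights are adapted to a new task with a few gradient steps, here on a toy sine-regression task like the paper's toy experiments. The fixed layer widths stand in for a controller-sampled architecture, and all names (`init_meta_weights`, `forward`, `adapt`) are illustrative assumptions.

```python
# Minimal sketch of few-step adaptation from meta-weights (MAML-style).
# Assumption: a functional MLP whose widths stand in for a sampled architecture.
import math
import torch
import torch.nn.functional as F

def init_meta_weights(dims):
    """One (weight, bias) pair per layer of an MLP with the given widths."""
    params = []
    for i, o in zip(dims[:-1], dims[1:]):
        w = (torch.randn(o, i) / math.sqrt(i)).requires_grad_()
        b = torch.zeros(o, requires_grad=True)
        params += [w, b]
    return params

def forward(params, x):
    """Functional MLP forward pass over an explicit parameter list."""
    for i in range(0, len(params) - 2, 2):
        x = F.relu(F.linear(x, params[i], params[i + 1]))
    return F.linear(x, params[-2], params[-1])

def adapt(params, x, y, inner_lr=0.01, steps=5):
    """Take a few gradient steps from the meta-weights on one task's data.
    create_graph=True keeps adaptation differentiable, so an outer loop
    could update the meta-weights through it (second-order meta-learning)."""
    for _ in range(steps):
        loss = F.mse_loss(forward(params, x), y)
        grads = torch.autograd.grad(loss, params, create_graph=True)
        params = [p - inner_lr * g for p, g in zip(params, grads)]
    return params

# Toy usage: adapt to one sine-wave regression task.
x = torch.linspace(-5, 5, 32).unsqueeze(1)
y = torch.sin(x)
meta = init_meta_weights([1, 40, 40, 1])  # widths: stand-in for a sampled arch
task_params = adapt(meta, x, y)
print(F.mse_loss(forward(task_params, x), y).item())
```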
Cite
Text
Wang et al. "M-NAS: Meta Neural Architecture Search." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I04.6084
Markdown
[Wang et al. "M-NAS: Meta Neural Architecture Search." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/wang2020aaai-m/) doi:10.1609/AAAI.V34I04.6084
BibTeX
@inproceedings{wang2020aaai-m,
title = {{M-NAS: Meta Neural Architecture Search}},
author = {Wang, Jiaxing and Wu, Jiaxiang and Bai, Haoli and Cheng, Jian},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
pages = {6186-6193},
doi = {10.1609/AAAI.V34I04.6084},
url = {https://mlanthology.org/aaai/2020/wang2020aaai-m/}
}