SUMNAS: Supernet with Unbiased Meta-Features for Neural Architecture Search
Abstract
One-shot Neural Architecture Search (NAS) usually constructs an over-parameterized network, which we call a supernet, and typically adopts parameter sharing among the sub-models to improve computational efficiency. One-shot NAS often repeatedly samples sub-models from the supernet and trains them to optimize the shared parameters. However, this training strategy suffers from multi-model forgetting: training a sampled sub-model overrides the knowledge previously learned by the other sub-models, resulting in an unfair performance evaluation between the sub-models. We propose Supernet with Unbiased Meta-Features for Neural Architecture Search (SUMNAS), a supernet learning strategy based on meta-learning that tackles this knowledge forgetting issue. During the training phase, we explicitly address the multi-model forgetting problem and help the supernet learn unbiased meta-features that are independent of the sampled sub-models. Once training is over, sub-models can be instantly compared to obtain the overall ranking or the best sub-model. Our evaluations on the NAS-Bench-201 and MobileNet-based search spaces demonstrate that SUMNAS shows improved ranking ability and finds architectures whose performance is on par with existing state-of-the-art NAS algorithms.
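The abstract describes training the supernet with a meta-learning objective so that the shared parameters remain useful for all sub-models instead of being overwritten by whichever sub-model was sampled last. Below is a minimal Reptile-style sketch of such an update on a toy supernet; the names (ToySupernet, meta_update_step), the choice of a Reptile-like outer step, and all hyperparameters are illustrative assumptions for exposition, not the paper's exact algorithm.

```python
import copy
import random
import torch
import torch.nn as nn

class ToySupernet(nn.Module):
    """Toy supernet: each layer offers several candidate ops; a sub-model
    is defined by choosing one op index per layer. Illustrative only."""
    def __init__(self, in_dim=16, hidden=32, num_layers=3, num_choices=2):
        super().__init__()
        self.layers = nn.ModuleList([
            nn.ModuleList([nn.Linear(in_dim if i == 0 else hidden, hidden)
                           for _ in range(num_choices)])
            for i in range(num_layers)
        ])
        self.head = nn.Linear(hidden, 10)
        self.num_choices = num_choices

    def forward(self, x, arch):
        # arch: list of op indices, one per layer, selecting the sub-model path
        for layer, choice in zip(self.layers, arch):
            x = torch.relu(layer[choice](x))
        return self.head(x)

def sample_arch(supernet):
    # Uniformly sample a sub-model (one candidate op per layer)
    return [random.randrange(supernet.num_choices) for _ in supernet.layers]

def meta_update_step(supernet, batch, inner_steps=2, inner_lr=0.05,
                     meta_lr=0.5, num_subnets=4):
    """One Reptile-style meta-update (hypothetical): adapt a copy of the
    shared weights to each sampled sub-model, then move the shared weights
    toward the average adapted weights, rather than letting the last
    sampled sub-model overwrite the others."""
    x, y = batch
    base = copy.deepcopy(supernet.state_dict())
    accum = {k: torch.zeros_like(v) for k, v in base.items()}

    for _ in range(num_subnets):
        supernet.load_state_dict(base)            # restart from shared weights
        arch = sample_arch(supernet)
        opt = torch.optim.SGD(supernet.parameters(), lr=inner_lr)
        for _ in range(inner_steps):               # inner adaptation on this sub-model
            opt.zero_grad()
            loss = nn.functional.cross_entropy(supernet(x, arch), y)
            loss.backward()
            opt.step()
        # Accumulate the averaged weight displacement produced by adaptation
        for k, v in supernet.state_dict().items():
            accum[k] += (v - base[k]) / num_subnets

    # Outer (meta) step: interpolate shared weights toward the averaged adaptation
    supernet.load_state_dict({k: base[k] + meta_lr * accum[k] for k in base})

if __name__ == "__main__":
    torch.manual_seed(0)
    net = ToySupernet()
    data = (torch.randn(8, 16), torch.randint(0, 10, (8,)))
    for _ in range(5):
        meta_update_step(net, data)
```

Under this kind of update, the shared parameters are pushed toward a point that adapts well to every sampled sub-model, which is one way to read the abstract's "unbiased meta-features" and the claim that sub-models can be compared directly once training finishes.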
Cite
Text
Ha et al. "SUMNAS: Supernet with Unbiased Meta-Features for Neural Architecture Search." International Conference on Learning Representations, 2022.
Markdown
[Ha et al. "SUMNAS: Supernet with Unbiased Meta-Features for Neural Architecture Search." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/ha2022iclr-sumnas/)
BibTeX
@inproceedings{ha2022iclr-sumnas,
title = {{SUMNAS: Supernet with Unbiased Meta-Features for Neural Architecture Search}},
author = {Ha, Hyeonmin and Kim, Ji-Hoon and Park, Semin and Chun, Byung-Gon},
booktitle = {International Conference on Learning Representations},
year = {2022},
url = {https://mlanthology.org/iclr/2022/ha2022iclr-sumnas/}
}