XNAS: Neural Architecture Search with Expert Advice
Abstract
This paper introduces a novel optimization method for differentiable neural architecture search, based on the theory of prediction with expert advice. Its optimization criterion is well suited to architecture selection: it minimizes the regret incurred by a sub-optimal selection of operations. Unlike previous search relaxations, which require hard pruning of architectures, our method is designed to dynamically wipe out inferior architectures and enhance superior ones. It achieves an optimal worst-case regret bound and suggests the use of multiple learning rates, based on the amount of information carried by the backward gradients. Experiments show that our algorithm achieves strong performance over several image classification datasets. Specifically, it obtains an error rate of 1.6% for CIFAR-10, 23.9% for ImageNet under mobile settings, and achieves state-of-the-art results on three additional datasets.
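The expert-advice view treats each candidate operation as an expert and updates its weight with an exponentiated-gradient (Hedge-style) rule, zeroing out operations whose weight collapses instead of hard-pruning at the end. The following is a minimal sketch of that idea, not the paper's exact algorithm; the function names, the fixed learning rate, and the wipe-out threshold are illustrative assumptions.

```python
import numpy as np

def eg_update(weights, grads, lr):
    """One exponentiated-gradient (Hedge-style) step over candidate operations.

    weights: probability vector over operations (the "experts")
    grads:   backward gradients w.r.t. each operation (treated as losses)
    lr:      learning rate (illustrative; XNAS ties this to gradient magnitude)
    """
    w = weights * np.exp(-lr * grads)
    return w / w.sum()

def wipe_out(weights, threshold):
    """Zero out operations whose weight fell below the threshold, then renormalize."""
    w = np.where(weights < threshold, 0.0, weights)
    return w / w.sum()

# Toy run: three candidate operations; op 2 keeps incurring the highest loss,
# so its weight decays exponentially and is wiped out rather than hard-pruned.
w = np.ones(3) / 3
for _ in range(50):
    grads = np.array([0.1, 0.2, 1.0])
    w = eg_update(w, grads, lr=0.5)
w = wipe_out(w, threshold=1e-3)
```

The multiplicative update is what yields the worst-case regret guarantee in the experts setting; the dynamic wipe-out simply removes experts whose posterior weight has already become negligible.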
Cite
Text
Nayman et al. "XNAS: Neural Architecture Search with Expert Advice." Neural Information Processing Systems, 2019.
Markdown
[Nayman et al. "XNAS: Neural Architecture Search with Expert Advice." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/nayman2019neurips-xnas/)
BibTeX
@inproceedings{nayman2019neurips-xnas,
title = {{XNAS: Neural Architecture Search with Expert Advice}},
author = {Nayman, Niv and Noy, Asaf and Ridnik, Tal and Friedman, Itamar and Jin, Rong and Zelnik, Lihi},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {1977--1987},
url = {https://mlanthology.org/neurips/2019/nayman2019neurips-xnas/}
}