Adaptive Stochastic Natural Gradient Method for One-Shot Neural Architecture Search

Abstract

The high sensitivity of neural architecture search (NAS) methods to their inputs, such as the step-size (i.e., learning rate) and the search space, prevents practitioners from applying them out-of-the-box to their own problems, even though their purpose is to automate part of the tuning process. Aiming at a fast, robust, and widely applicable NAS, we develop a generic optimization framework for NAS. We turn the coupled optimization of connection weights and neural architecture into a differentiable optimization by means of stochastic relaxation. The framework accepts an arbitrary search space (widely applicable) and enables gradient-based simultaneous optimization of weights and architecture (fast). We propose a stochastic natural gradient method with an adaptive step-size mechanism built upon our theoretical investigation (robust). Despite its simplicity and the absence of problem-dependent parameter tuning, our method exhibits near state-of-the-art performance with a low computational budget on both image classification and inpainting tasks.
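
The sketch below illustrates the core idea of a stochastic natural gradient step on the parameters of a categorical distribution over architecture choices. The toy objective, sample size, and fixed step-size are illustrative assumptions of this sketch, not the authors' reference implementation; in particular, the paper's adaptive step-size mechanism and the simultaneous weight updates are omitted for brevity.

import numpy as np

# Minimal sketch (assumptions labeled above): stochastic natural-gradient
# ascent on a categorical distribution over candidate architecture choices.
rng = np.random.default_rng(0)

n_choices = 4                                # hypothetical number of candidate operations
theta = np.full(n_choices, 1.0 / n_choices)  # categorical probabilities over choices
best = 2                                     # hypothetical best-performing choice

def score(choice):
    # Stand-in for the validation score of a sampled architecture
    # (in the real setting this would depend on the shared weights).
    return 1.0 if choice == best else 0.0

step_size = 0.1   # fixed here; the paper adapts it from accumulated updates
n_samples = 2

for _ in range(300):
    # Monte Carlo samples from the current architecture distribution.
    choices = rng.choice(n_choices, size=n_samples, p=theta)
    scores = np.array([score(c) for c in choices])
    utilities = scores - scores.mean()        # centering gives scale-invariant utilities

    # For a categorical distribution in expectation parameters, the natural
    # gradient of log p_theta(c) reduces to one_hot(c) - theta.
    nat_grad = sum(u * (np.eye(n_choices)[c] - theta)
                   for c, u in zip(choices, utilities)) / n_samples

    theta = np.clip(theta + step_size * nat_grad, 1e-6, None)
    theta /= theta.sum()                       # project back onto the simplex

print("final architecture distribution:", np.round(theta, 3))

Running this toy loop concentrates the distribution on the best-scoring choice, which is the behavior the stochastic relaxation relies on when architecture parameters and weights are optimized jointly.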

Cite

Text

Akimoto et al. "Adaptive Stochastic Natural Gradient Method for One-Shot Neural Architecture Search." International Conference on Machine Learning, 2019.

Markdown

[Akimoto et al. "Adaptive Stochastic Natural Gradient Method for One-Shot Neural Architecture Search." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/akimoto2019icml-adaptive/)

BibTeX

@inproceedings{akimoto2019icml-adaptive,
  title     = {{Adaptive Stochastic Natural Gradient Method for One-Shot Neural Architecture Search}},
  author    = {Akimoto, Youhei and Shirakawa, Shinichi and Yoshinari, Nozomu and Uchida, Kento and Saito, Shota and Nishida, Kouhei},
  booktitle = {International Conference on Machine Learning},
  year      = {2019},
  pages     = {171--180},
  volume    = {97},
  url       = {https://mlanthology.org/icml/2019/akimoto2019icml-adaptive/}
}