A Max-Flow Based Approach for Neural Architecture Search
Abstract
Neural Architecture Search (NAS) aims to automatically produce network architectures suited to specific tasks on given datasets. Unlike previous NAS strategies based on reinforcement learning, genetic algorithms, Bayesian optimization, and differentiable programming, we formulate the NAS task as a Max-Flow problem on a search space represented as a Directed Acyclic Graph (DAG), and thus propose a novel NAS approach, called MF-NAS, which defines the search space and designs the search strategy in a fully graph-based manner. In MF-NAS, parallel edges with capacities are induced by combining different operations, including skip connections, convolutions, and pooling, and the weights and capacities of the parallel edges are updated iteratively during the search process. Moreover, we interpret MF-NAS from the perspective of nonparametric density estimation and show the relationship between the flow of a graph and the classification accuracy of the corresponding neural network architecture. We evaluate the competitive efficacy of our proposed MF-NAS across different datasets with the different search spaces used in DARTS/ENAS and NAS-Bench-201.
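The abstract casts architecture search as a Max-Flow problem on a DAG whose parallel edges correspond to candidate operations with learned capacities. The sketch below is not the paper's MF-NAS algorithm; it only illustrates the underlying graph primitive with a generic Edmonds-Karp max-flow routine on a tiny hypothetical cell DAG, where the node labels, operation names, and capacity values are all illustrative assumptions.

```python
from collections import defaultdict, deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: repeatedly augment along shortest residual paths (BFS)."""
    flow = 0.0
    while True:
        # BFS for an augmenting path in the residual graph.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in capacity[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Recover the path and its bottleneck capacity.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(capacity[u][v] for u, v in path)
        # Push flow: decrease forward residuals, increase reverse residuals.
        for u, v in path:
            capacity[u][v] -= bottleneck
            capacity[v][u] += bottleneck
        flow += bottleneck

# Hypothetical cell DAG: nodes 0 (input) -> 1 -> 2 (output).
# Parallel candidate operations between a node pair are merged by
# summing their capacities into a single residual-capacity entry.
ops = [
    (0, 1, "skip",    1.0),
    (0, 1, "conv3x3", 2.0),
    (1, 2, "conv3x3", 2.0),
    (1, 2, "pool",    1.0),
    (0, 2, "skip",    1.0),
]
capacity = defaultdict(lambda: defaultdict(float))
for u, v, _, c in ops:
    capacity[u][v] += c

print(max_flow(capacity, 0, 2))  # 4.0: the min cut separates {0} from {1, 2}
```

In MF-NAS proper, the capacities are updated iteratively during the search rather than fixed as above, and the resulting flow values relate to the architecture's classification accuracy.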
Cite
Text
Xue et al. "A Max-Flow Based Approach for Neural Architecture Search." Proceedings of the European Conference on Computer Vision (ECCV), 2022. doi:10.1007/978-3-031-20044-1_39
Markdown
[Xue et al. "A Max-Flow Based Approach for Neural Architecture Search." Proceedings of the European Conference on Computer Vision (ECCV), 2022.](https://mlanthology.org/eccv/2022/xue2022eccv-maxflow/) doi:10.1007/978-3-031-20044-1_39
BibTeX
@inproceedings{xue2022eccv-maxflow,
title = {{A Max-Flow Based Approach for Neural Architecture Search}},
author = {Xue, Chao and Wang, Xiaoxing and Yan, Junchi and Li, Chun-Guang},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2022},
doi = {10.1007/978-3-031-20044-1_39},
url = {https://mlanthology.org/eccv/2022/xue2022eccv-maxflow/}
}