PINAT: A Permutation INvariance Augmented Transformer for NAS Predictor

Abstract

Time-consuming performance evaluation is the bottleneck of traditional Neural Architecture Search (NAS) methods. Predictor-based NAS can speed up performance evaluation by directly predicting performance rather than training a large number of sub-models and then validating them. Most predictor-based NAS approaches use a proxy dataset to train model-based predictors efficiently, but they suffer from performance degradation and generalization problems. We attribute these problems to the limited ability of existing predictors to characterize the sub-models' structure, specifically the topology information extraction and the node feature representation of the input graph data. To address these problems, we propose PINAT, a Transformer-like NAS predictor that consists of a Permutation INvariance Augmentation module serving as both the token embedding layer and the self-attention head, together with a Laplacian matrix used as the positional encoding. Our design produces more representative features of the encoded architecture and outperforms state-of-the-art NAS predictors on six search spaces: NAS-Bench-101, NAS-Bench-201, DARTS, ProxylessNAS, PPI, and ModelNet. The code is available at https://github.com/ShunLu91/PINAT.
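
To illustrate the general idea described above, the sketch below encodes a cell's DAG with Laplacian-eigenvector positional encodings and feeds operation tokens to a plain Transformer encoder that regresses accuracy. This is not the authors' PINAT implementation (see the repository linked above): the operation vocabulary, dimensions, and the standard self-attention used here in place of the permutation-invariance augmented module are all illustrative assumptions.

```python
import numpy as np
import torch
import torch.nn as nn


def laplacian_positional_encoding(adj: np.ndarray, k: int) -> torch.Tensor:
    """Eigenvectors of the symmetric normalized Laplacian as node positional encodings.

    adj: (N, N) adjacency matrix of a cell's DAG (symmetrized here for simplicity).
    k:   number of non-trivial eigenvectors to keep (zero-padded for small cells).
    """
    a = np.maximum(adj, adj.T).astype(np.float64)      # treat the DAG as undirected
    deg = a.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    lap = np.eye(len(a)) - d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]
    _, eigvecs = np.linalg.eigh(lap)                    # eigenvalues in ascending order
    pe = eigvecs[:, 1:k + 1]                            # drop the trivial first eigenvector
    if pe.shape[1] < k:
        pe = np.pad(pe, ((0, 0), (0, k - pe.shape[1])))
    return torch.tensor(pe, dtype=torch.float32)


class TransformerPredictor(nn.Module):
    """Toy Transformer encoder that maps an encoded architecture to a predicted score."""

    def __init__(self, num_ops: int, dim: int = 64, k: int = 8):
        super().__init__()
        self.op_embed = nn.Embedding(num_ops, dim)       # token embedding per operation
        self.pe_proj = nn.Linear(k, dim)                 # project Laplacian PE to model dim
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(dim, 1)                    # regress predicted accuracy

    def forward(self, ops: torch.Tensor, pe: torch.Tensor) -> torch.Tensor:
        x = self.op_embed(ops) + self.pe_proj(pe)        # node features + positional encoding
        x = self.encoder(x)
        return self.head(x.mean(dim=1)).squeeze(-1)      # mean-pool nodes, output a scalar


# Minimal usage on a hypothetical 4-node cell with 5 candidate operations.
adj = np.array([[0, 1, 1, 0],
                [0, 0, 0, 1],
                [0, 0, 0, 1],
                [0, 0, 0, 0]])
pe = laplacian_positional_encoding(adj, k=8).unsqueeze(0)   # (1, 4, 8)
ops = torch.tensor([[0, 2, 3, 1]])                           # (1, 4) operation indices
score = TransformerPredictor(num_ops=5)(ops, pe)             # (1,) predicted accuracy
```

A trained predictor of this kind would be fit on a small proxy set of (architecture, accuracy) pairs and then used to rank unseen architectures; PINAT additionally replaces the embedding and attention components with its permutation-invariance augmentation module, as described in the abstract.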

Cite

Text

Lu et al. "PINAT: A Permutation INvariance Augmented Transformer for NAS Predictor." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I7.26076

Markdown

[Lu et al. "PINAT: A Permutation INvariance Augmented Transformer for NAS Predictor." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/lu2023aaai-pinat/) doi:10.1609/AAAI.V37I7.26076

BibTeX

@inproceedings{lu2023aaai-pinat,
  title     = {{PINAT: A Permutation INvariance Augmented Transformer for NAS Predictor}},
  author    = {Lu, Shun and Hu, Yu and Wang, Peihao and Han, Yan and Tan, Jianchao and Li, Jixiang and Yang, Sen and Liu, Ji},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {8957--8965},
  doi       = {10.1609/AAAI.V37I7.26076},
  url       = {https://mlanthology.org/aaai/2023/lu2023aaai-pinat/}
}