FlowerFormer: Empowering Neural Architecture Encoding Using a Flow-Aware Graph Transformer
Abstract
The success of a specific neural network architecture is closely tied to the dataset and task it tackles; there is no one-size-fits-all solution. Thus, considerable efforts have been made to quickly and accurately estimate the performance of neural architectures, without full training or evaluation, for given tasks and datasets. Neural architecture encoding has played a crucial role in such estimation, and graph-based methods, which treat an architecture as a graph, have shown prominent performance. For enhanced representation learning of neural architectures, we introduce FlowerFormer, a powerful graph transformer that incorporates the information flows within a neural architecture. FlowerFormer consists of two key components: (a) bidirectional asynchronous message passing, inspired by the flows; (b) global attention built on flow-based masking. Our extensive experiments demonstrate the superiority of FlowerFormer over existing neural encoding methods, and its effectiveness extends beyond computer vision models to include graph neural networks and automatic speech recognition models. Our code is available at http://github.com/y0ngjaenius/CVPR2024_FLOWERFormer.
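To make the two key components concrete, the following is a minimal NumPy sketch of the underlying ideas: message passing that updates nodes one at a time in (reverse) topological order of the architecture's computation graph, and global self-attention restricted to node pairs connected by a directed path. This is not the authors' implementation (see the linked repository for that); the function names, the mean-aggregation update, and the single-head, unparameterized attention are illustrative assumptions.

import numpy as np

def topological_order(adj):
    # Kahn's algorithm; adj[i, j] != 0 iff there is an edge i -> j.
    n = adj.shape[0]
    indeg = (adj != 0).sum(axis=0)
    order = [v for v in range(n) if indeg[v] == 0]
    for v in order:  # the list grows as nodes become ready
        for u in np.nonzero(adj[v])[0]:
            indeg[u] -= 1
            if indeg[u] == 0:
                order.append(int(u))
    return order

def async_message_passing(h, adj, w, forward=True):
    # Update nodes one at a time following the information flow, so each
    # node aggregates the *already updated* states of its predecessors.
    h = h.copy()
    order = topological_order(adj)
    if not forward:  # backward pass: reverse the flow direction
        order, adj = order[::-1], adj.T
    for v in order:
        preds = np.nonzero(adj[:, v])[0]
        if preds.size:
            h[v] = np.tanh(h[preds].mean(axis=0) @ w + h[v])
    return h

def flow_mask(adj):
    # Transitive closure of the DAG: node pairs linked by a directed path
    # (in either direction) may attend to each other; others are masked.
    n = adj.shape[0]
    reach = (adj != 0).astype(int)
    np.fill_diagonal(reach, 1)
    for _ in range(int(np.ceil(np.log2(max(n, 2)))) + 1):
        reach = ((reach @ reach) > 0).astype(int)
    return (reach + reach.T) > 0

def flow_masked_attention(h, mask):
    # Scaled dot-product self-attention with disallowed pairs set to
    # -inf before the softmax, so attention follows the flow structure.
    scores = (h @ h.T) / np.sqrt(h.shape[1])
    scores[~mask] = -np.inf
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ h

# Tiny example: a 4-node cell with edges 0->1, 0->2, 1->3, 2->3.
adj = np.zeros((4, 4))
adj[0, 1] = adj[0, 2] = adj[1, 3] = adj[2, 3] = 1
h = np.random.randn(4, 8)
h = async_message_passing(h, adj, np.eye(8), forward=True)
h = async_message_passing(h, adj, np.eye(8), forward=False)
out = flow_masked_attention(h, flow_mask(adj))

The asynchronous update order is what distinguishes this from standard synchronous message passing: information can traverse the entire depth of the architecture within a single forward (or backward) pass, mirroring how activations (or gradients) actually flow through the network.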
Cite
Text
Hwang et al. "FlowerFormer: Empowering Neural Architecture Encoding Using a Flow-Aware Graph Transformer." Conference on Computer Vision and Pattern Recognition, 2024. doi:10.1109/CVPR52733.2024.00586
Markdown
[Hwang et al. "FlowerFormer: Empowering Neural Architecture Encoding Using a Flow-Aware Graph Transformer." Conference on Computer Vision and Pattern Recognition, 2024.](https://mlanthology.org/cvpr/2024/hwang2024cvpr-flowerformer/) doi:10.1109/CVPR52733.2024.00586
BibTeX
@inproceedings{hwang2024cvpr-flowerformer,
title = {{FlowerFormer: Empowering Neural Architecture Encoding Using a Flow-Aware Graph Transformer}},
author = {Hwang, Dongyeong and Kim, Hyunju and Kim, Sunwoo and Shin, Kijung},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2024},
pages = {6128--6137},
doi = {10.1109/CVPR52733.2024.00586},
url = {https://mlanthology.org/cvpr/2024/hwang2024cvpr-flowerformer/}
}