Representing Long-Range Context for Graph Neural Networks with Global Attention

Abstract

Graph neural networks are powerful architectures for structured datasets. However, current methods struggle to represent long-range dependencies. Scaling the depth or width of GNNs is insufficient to broaden receptive fields, as larger GNNs encounter optimization instabilities such as vanishing gradients and representation oversmoothing, while pooling-based approaches have yet to become as universally useful as in computer vision. In this work, we propose the use of Transformer-based self-attention to learn long-range pairwise relationships, with a novel “readout” mechanism to obtain a global graph embedding. Inspired by recent computer vision results that find position-invariant attention performant in learning long-range relationships, our method, which we call GraphTrans, applies a permutation-invariant Transformer module after a standard GNN module. This simple architecture leads to state-of-the-art results on several graph classification tasks, outperforming methods that explicitly encode graph structure. Our results suggest that purely learning-based approaches without graph structure may be suitable for learning high-level, long-range relationships on graphs. Code for GraphTrans is available at https://github.com/ucbrise/graphtrans.
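A minimal sketch of the idea the abstract describes: a standard GNN module for local message passing, followed by a permutation-invariant Transformer, with a learned readout token (analogous to BERT's `[CLS]`) whose output serves as the global graph embedding. The GCN-style backbone, dense-adjacency message passing, and all layer sizes below are illustrative assumptions, not the authors' exact implementation; see the linked repository for the real code.

```python
import torch
import torch.nn as nn


class GraphTransSketch(nn.Module):
    def __init__(self, in_dim, hid_dim, num_classes,
                 gnn_layers=3, attn_layers=2, heads=4):
        super().__init__()
        # Local message passing: a simple GCN-style stack (hypothetical backbone).
        dims = [in_dim] + [hid_dim] * gnn_layers
        self.gnn = nn.ModuleList(
            nn.Linear(d_in, d_out) for d_in, d_out in zip(dims, dims[1:])
        )
        # Learned readout token, prepended to the node sequence like [CLS].
        self.cls_token = nn.Parameter(torch.zeros(1, 1, hid_dim))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=hid_dim, nhead=heads, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=attn_layers)
        self.classifier = nn.Linear(hid_dim, num_classes)

    def forward(self, x, adj):
        # x: (batch, nodes, in_dim); adj: (batch, nodes, nodes) dense adjacency.
        # Add self-loops and row-normalize so each node averages its neighborhood.
        adj = adj + torch.eye(adj.size(-1), device=adj.device)
        adj = adj / adj.sum(dim=-1, keepdim=True)
        for layer in self.gnn:
            x = torch.relu(layer(adj @ x))  # one round of neighborhood averaging
        # Prepend the readout token; no positional encoding is added, so the
        # attention module stays permutation-invariant over nodes.
        cls = self.cls_token.expand(x.size(0), -1, -1)
        h = self.transformer(torch.cat([cls, x], dim=1))
        # The readout token's output is the global graph embedding.
        return self.classifier(h[:, 0])


# Usage on a toy batch: 8 graphs, 16 nodes each, 32-dim node features.
model = GraphTransSketch(in_dim=32, hid_dim=64, num_classes=10)
x = torch.randn(8, 16, 32)
adj = torch.randint(0, 2, (8, 16, 16)).float()
print(model(x, adj).shape)  # torch.Size([8, 10])
```

The key design choice, per the abstract, is that the Transformer attends over all node pairs without positional information, so long-range relationships are learned directly rather than propagated hop by hop through the GNN.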

Cite

Text

Wu et al. "Representing Long-Range Context for Graph Neural Networks with Global Attention." Neural Information Processing Systems, 2021.

Markdown

[Wu et al. "Representing Long-Range Context for Graph Neural Networks with Global Attention." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/wu2021neurips-representing/)

BibTeX

@inproceedings{wu2021neurips-representing,
  title     = {{Representing Long-Range Context for Graph Neural Networks with Global Attention}},
  author    = {Wu, Zhanghao and Jain, Paras and Wright, Matthew and Mirhoseini, Azalia and Gonzalez, Joseph E. and Stoica, Ion},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/wu2021neurips-representing/}
}