GOAT: A Global Transformer on Large-Scale Graphs

Abstract

Graph transformers have been competitive on graph classification tasks, but they fail to outperform Graph Neural Networks (GNNs) on node classification, which is a common task performed on large-scale graphs for industrial applications. Meanwhile, existing GNN architectures are limited in their ability to perform equally well on both homophilious and heterophilious graphs, as their inductive biases are generally tailored to only one setting. To address these issues, we propose GOAT, a scalable global graph transformer. In GOAT, each node conceptually attends to all the nodes in the graph, and homophily/heterophily relationships can be learnt adaptively from the data. We provide theoretical justification for our approximate global self-attention scheme and show that it scales to large-scale graphs. We demonstrate the competitiveness of GOAT on both heterophilious and homophilious graphs with millions of nodes.
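The scalability claim above rests on approximating full self-attention: instead of every node attending to all N other nodes, each node attends to a small codebook of K centroids that summarize the whole graph, reducing the cost from O(N^2) to O(NK). The sketch below illustrates this general idea only; the class name ApproxGlobalAttention, the num_centroids parameter, and the EMA-style codebook refresh are assumptions made here for exposition, not the paper's released implementation.

# Minimal sketch of approximate global self-attention via a centroid
# codebook. All names are hypothetical; this is not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ApproxGlobalAttention(nn.Module):
    def __init__(self, dim: int, num_centroids: int = 256):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Codebook summarizing the whole graph; assumed here to be kept
        # fresh with an EMA (mini-batch k-means-style) update over nodes.
        self.register_buffer("centroids", torch.randn(num_centroids, dim))

    @torch.no_grad()
    def update_centroids(self, x: torch.Tensor, momentum: float = 0.99):
        # Assign each node to its nearest centroid, then move each
        # centroid toward the mean of its assigned nodes (EMA update).
        dists = torch.cdist(x, self.centroids)   # (N, K)
        assign = dists.argmin(dim=1)             # (N,)
        for j in range(self.centroids.size(0)):
            members = x[assign == j]
            if members.numel() > 0:
                self.centroids[j].mul_(momentum).add_(
                    members.mean(dim=0), alpha=1 - momentum)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features. Each node attends over K centroid
        # summaries rather than all N nodes: O(N*K) instead of O(N^2).
        q = self.q(x)                             # (N, dim)
        k = self.k(self.centroids)                # (K, dim)
        v = self.v(self.centroids)                # (K, dim)
        attn = F.softmax(q @ k.t() / q.size(-1) ** 0.5, dim=-1)  # (N, K)
        return attn @ v                           # (N, dim)

In a full model along the lines the abstract describes, such a global branch would be combined with a local message-passing branch, and the codebook would be refreshed periodically during training so the centroids track the evolving node representations.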

Cite

Text

Kong et al. "GOAT: A Global Transformer on Large-Scale Graphs." International Conference on Machine Learning, 2023.

Markdown

[Kong et al. "GOAT: A Global Transformer on Large-Scale Graphs." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/kong2023icml-goat/)

BibTeX

@inproceedings{kong2023icml-goat,
  title     = {{GOAT: A Global Transformer on Large-Scale Graphs}},
  author    = {Kong, Kezhi and Chen, Jiuhai and Kirchenbauer, John and Ni, Renkun and Bruss, C. Bayan and Goldstein, Tom},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {17375--17390},
  volume    = {202},
  publisher = {PMLR},
  url       = {https://mlanthology.org/icml/2023/kong2023icml-goat/}
}