Gapformer: Graph Transformer with Graph Pooling for Node Classification
Abstract
Graph Transformers (GTs) have proven their advantage in graph-level tasks. However, existing GTs still perform unsatisfactorily on the node classification task due to 1) the overwhelming unrelated information obtained from a vast number of irrelevant distant nodes and 2) the quadratic complexity, with respect to the number of nodes, of the fully connected attention mechanism. In this paper, we present Gapformer, a method for node classification that deeply incorporates Graph Transformer with Graph Pooling. More specifically, Gapformer coarsens the large-scale nodes of a graph into a smaller number of pooling nodes via local or global graph pooling methods, and then computes attention solely with the pooling nodes rather than with all other nodes. In this manner, the negative influence of the overwhelming unrelated nodes is mitigated while long-range information is maintained, and the quadratic complexity is reduced to linear complexity with respect to the fixed number of pooling nodes. Extensive experiments on 13 node classification datasets, including homophilic and heterophilic graph datasets, demonstrate the competitive performance of Gapformer over existing Graph Neural Networks and GTs.
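The complexity argument in the abstract can be illustrated with a minimal sketch: instead of each of the N nodes attending to all N nodes (O(N²) scores), every node attends only to k pooled nodes, giving O(N·k) scores. The sketch below uses simple segment mean-pooling and random projections as stand-ins for the paper's learned local/global pooling operators and weight matrices; all function and variable names here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def pooled_attention(X, n_pool=8, d_head=16):
    """Attend from all N nodes to n_pool pooling nodes only.

    X: (N, d) node feature matrix. Mean-pooling over index segments
    stands in for the paper's local/global graph pooling (assumption).
    """
    N, d = X.shape
    # Coarsen N nodes into n_pool pooling nodes: (n_pool, d).
    segments = np.array_split(np.arange(N), n_pool)
    P = np.stack([X[idx].mean(axis=0) for idx in segments])
    # Random projections stand in for learned Q/K/V weights.
    Wq, Wk, Wv = (rng.standard_normal((d, d_head)) for _ in range(3))
    Q = X @ Wq          # (N, d_head)      queries from all nodes
    K = P @ Wk          # (n_pool, d_head) keys from pooling nodes only
    V = P @ Wv          # (n_pool, d_head)
    scores = Q @ K.T / np.sqrt(d_head)   # (N, n_pool): O(N * n_pool), not O(N^2)
    scores -= scores.max(axis=1, keepdims=True)   # numerically stable softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)
    return attn @ V     # (N, d_head) updated node representations

X = rng.standard_normal((100, 32))
out = pooled_attention(X)
print(out.shape)
```

With a fixed n_pool, the attention cost grows linearly in N, which matches the linear-complexity claim in the abstract.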
Cite
Text
Liu et al. "Gapformer: Graph Transformer with Graph Pooling for Node Classification." International Joint Conference on Artificial Intelligence, 2023. doi:10.24963/IJCAI.2023/244
Markdown
[Liu et al. "Gapformer: Graph Transformer with Graph Pooling for Node Classification." International Joint Conference on Artificial Intelligence, 2023.](https://mlanthology.org/ijcai/2023/liu2023ijcai-gapformer/) doi:10.24963/IJCAI.2023/244
BibTeX
@inproceedings{liu2023ijcai-gapformer,
title = {{Gapformer: Graph Transformer with Graph Pooling for Node Classification}},
author = {Liu, Chuang and Zhan, Yibing and Ma, Xueqi and Ding, Liang and Tao, Dapeng and Wu, Jia and Hu, Wenbin},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2023},
pages = {2196--2205},
doi = {10.24963/IJCAI.2023/244},
url = {https://mlanthology.org/ijcai/2023/liu2023ijcai-gapformer/}
}