Primphormer: Efficient Graph Transformers with Primal Representations
Abstract
Graph Transformers (GTs) have emerged as a promising approach for graph representation learning. Despite their successes, the quadratic complexity of GTs limits scalability on large graphs due to their pairwise computations. To fundamentally reduce the computational burden of GTs, we propose a primal-dual framework that interprets the self-attention mechanism on graphs as a dual representation. Based on this framework, we develop Primphormer, an efficient GT that leverages a primal representation with linear complexity. Theoretical analysis reveals that Primphormer serves as a universal approximator for functions on both sequences and graphs, while also retaining its expressive power for distinguishing non-isomorphic graphs. Extensive experiments on various graph benchmarks demonstrate that Primphormer achieves competitive empirical results while maintaining more user-friendly memory and computational costs.
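The key computational idea in the abstract is that working with explicit node-level (primal) features, rather than pairwise attention scores, brings the cost from quadratic down to linear in the number of nodes. The sketch below illustrates this with a generic feature-map linearization of attention; it is an assumption-laden demonstration of the primal view, not the paper's actual primal-dual construction, and all function names and the ELU-based feature map are hypothetical.

```python
# Illustrative sketch only: a generic feature-map ("primal") linearization of
# self-attention. NOT the actual Primphormer architecture; names and the
# ELU-based feature map are assumptions for demonstration.
import torch
import torch.nn.functional as F

def pairwise_attention(X, Wq, Wk, Wv):
    """Standard self-attention: materializes an (n, n) score matrix, O(n^2)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = torch.softmax(Q @ K.T / K.shape[-1] ** 0.5, dim=-1)
    return scores @ V

def primal_attention(X, Wq, Wk, Wv):
    """Linearized attention via explicit (primal) features: O(n) in n.

    With a positive feature map phi, attention can be rewritten as
    phi(Q) @ (phi(K)^T V): the (d, d) summary phi(K)^T V is computed once,
    and the (n, n) score matrix is never formed.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    phi_q, phi_k = F.elu(Q) + 1, F.elu(K) + 1      # positive node features
    kv = phi_k.T @ V                               # (d, d), independent of n
    z = phi_q @ phi_k.sum(dim=0, keepdim=True).T   # (n, 1) normalizer
    return (phi_q @ kv) / z

# Usage on a toy graph with n nodes and d-dimensional features.
n, d = 1000, 64
X = torch.randn(n, d)
Wq, Wk, Wv = (torch.randn(d, d) / d ** 0.5 for _ in range(3))
out = primal_attention(X, Wq, Wk, Wv)              # shape (n, d)
```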
Cite

Text

He et al. "Primphormer: Efficient Graph Transformers with Primal Representations." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[He et al. "Primphormer: Efficient Graph Transformers with Primal Representations." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/he2025icml-primphormer/)

BibTeX
@inproceedings{he2025icml-primphormer,
title = {{Primphormer: Efficient Graph Transformers with Primal Representations}},
author = {He, Mingzhen and Yang, Ruikai and Tian, Hanling and Qiu, Youmei and Huang, Xiaolin},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {22792--22815},
volume = {267},
url = {https://mlanthology.org/icml/2025/he2025icml-primphormer/}
}