Beyond Message Passing: Neural Graph Pattern Machine
Abstract
Graph learning tasks often hinge on identifying key substructure patterns—such as triadic closures in social networks or benzene rings in molecular graphs—that underpin downstream performance. However, most existing graph neural networks (GNNs) rely on message passing, which aggregates local neighborhood information iteratively and struggles to explicitly capture such fundamental motifs, like triangles, $k$-cliques, and rings. This limitation hinders both expressiveness and long-range dependency modeling. In this paper, we introduce the Neural Graph Pattern Machine (GPM), a novel framework that bypasses message passing by learning directly from graph substructures. GPM efficiently extracts, encodes, and prioritizes task-relevant graph patterns, offering greater expressivity and improved ability to capture long-range dependencies. Empirical evaluations across four standard tasks—node classification, link prediction, graph classification, and graph regression—demonstrate that GPM outperforms state-of-the-art baselines. Further analysis reveals that GPM exhibits strong out-of-distribution generalization, desirable scalability, and enhanced interpretability. Code and datasets are available at: https://github.com/Zehong-Wang/GPM.
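As a minimal illustration of the substructure patterns the abstract refers to (this is not the GPM implementation, just a plain-Python sketch), the snippet below enumerates triangles, the simplest motif that standard 1-WL message passing cannot reliably count, using only an adjacency-set intersection check:

```python
from itertools import combinations

def triangles(edges):
    """Enumerate triangles (3-cliques) in an undirected graph.

    Standard message-passing GNNs bounded by the 1-WL test cannot
    count triangles in general, which motivates learning directly
    from extracted substructures instead.
    """
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    found = set()
    for u in adj:
        # Any two neighbors of u that are themselves adjacent
        # close a triangle with u.
        for v, w in combinations(sorted(adj[u]), 2):
            if w in adj.get(v, set()):
                found.add(tuple(sorted((u, v, w))))
    return sorted(found)

# A 4-cycle with one chord (0-2) contains exactly two triangles.
print(triangles([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]))
# → [(0, 1, 2), (0, 2, 3)]
```

Larger motifs such as $k$-cliques or rings require analogous (and more expensive) enumeration, which is why efficient extraction and prioritization of task-relevant patterns is central to the approach.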
Cite
Text
Wang et al. "Beyond Message Passing: Neural Graph Pattern Machine." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Wang et al. "Beyond Message Passing: Neural Graph Pattern Machine." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/wang2025icml-beyond/)
BibTeX
@inproceedings{wang2025icml-beyond,
title = {{Beyond Message Passing: Neural Graph Pattern Machine}},
author = {Wang, Zehong and Zhang, Zheyuan and Ma, Tianyi and Chawla, Nitesh V. and Zhang, Chuxu and Ye, Yanfang},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {65496--65517},
volume = {267},
url = {https://mlanthology.org/icml/2025/wang2025icml-beyond/}
}