Enhancing Graph Transformers with SNNs and Mutual Information
Abstract
Although the integration of Graph Neural Networks (GNNs) and Transformers has demonstrated promising performance across various graph tasks, it remains computationally expensive. In contrast, brain-inspired Spiking Neural Networks (SNNs) offer an energy-efficient architecture due to their unique spike-based, event-driven paradigm. To address the high computational cost of Graph Transformers while preserving their effectiveness as much as possible, in this paper we propose a novel framework, CSSGT, which combines the representational strength of Transformers with the computational efficiency of SNNs for graph tasks, trained under the graph contrastive learning framework. CSSGT comprises two key components: Mutual Information-based Graph Split (MIGS) and Spike-Driven Graph Attention (SDGA). MIGS is designed for the sequential input of SNNs, splitting the graph while maximizing mutual information and minimizing redundancy. SDGA, tailored for graph data, exploits sparse graph convolution and addition operations, achieving low computational energy consumption. Extensive experiments on diverse datasets demonstrate that CSSGT converges within two epochs and outperforms various state-of-the-art models while maintaining low computational cost.
Cite
Text
Wang. "Enhancing Graph Transformers with SNNs and Mutual Information." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2025. doi:10.1007/978-3-032-05981-9_30
Markdown
[Wang. "Enhancing Graph Transformers with SNNs and Mutual Information." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2025.](https://mlanthology.org/ecmlpkdd/2025/wang2025ecmlpkdd-enhancing/) doi:10.1007/978-3-032-05981-9_30
BibTeX
@inproceedings{wang2025ecmlpkdd-enhancing,
title = {{Enhancing Graph Transformers with SNNs and Mutual Information}},
author = {Wang, Ziyu},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2025},
  pages = {511--526},
doi = {10.1007/978-3-032-05981-9_30},
url = {https://mlanthology.org/ecmlpkdd/2025/wang2025ecmlpkdd-enhancing/}
}