Greener GRASS: Enhancing GNNs with Encoding, Rewiring, and Attention
Abstract
Graph Neural Networks (GNNs) have become important tools for machine learning on graph-structured data. In this paper, we explore the synergistic combination of graph encoding, graph rewiring, and graph attention by introducing Graph Attention with Stochastic Structures (GRASS), a novel GNN architecture. GRASS utilizes relative random walk probabilities (RRWP) encoding and a novel decomposed variant (D-RRWP) to efficiently capture structural information. It rewires the input graph by superimposing a random regular graph to enhance long-range information propagation. It also employs a novel additive attention mechanism tailored for graph-structured data. Our empirical evaluations demonstrate that GRASS achieves state-of-the-art performance on multiple benchmark datasets, including a 20.3% reduction in mean absolute error on the ZINC dataset.
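To make two of the abstract's ingredients concrete, the sketch below illustrates (a) RRWP encoding, following the standard definition of stacked powers of the row-normalized random-walk matrix, and (b) rewiring by superimposing a random regular graph, here generated with NetworkX. This is a minimal illustration under those assumptions, not the authors' implementation; the function names `rrwp_encoding` and `superimpose_random_regular` and the default hyperparameters are hypothetical.

```python
# Illustrative sketch of RRWP encoding and random-regular-graph rewiring.
# Not the GRASS reference code; definitions follow the standard RRWP
# formulation P = [I, M, M^2, ...] with M = D^{-1} A.
import numpy as np
import networkx as nx


def rrwp_encoding(G: nx.Graph, num_steps: int = 8) -> np.ndarray:
    """Stack powers of the random-walk matrix M = D^{-1} A.

    Returns a (num_steps, n, n) tensor whose k-th slice holds the
    k-step random-walk probabilities between every pair of nodes.
    """
    A = nx.to_numpy_array(G)
    deg = A.sum(axis=1)
    deg[deg == 0] = 1.0              # guard against isolated nodes
    M = A / deg[:, None]             # row-normalized transition matrix
    powers = [np.eye(len(G))]        # 0-step probabilities = identity
    for _ in range(num_steps - 1):
        powers.append(powers[-1] @ M)
    return np.stack(powers)


def superimpose_random_regular(G: nx.Graph, degree: int = 3,
                               seed: int = 0) -> nx.Graph:
    """Rewire by adding the edges of a random d-regular graph on the
    same node set, shortening long-range propagation paths."""
    H = G.copy()
    R = nx.random_regular_graph(degree, len(G), seed=seed)
    relabel = dict(zip(R.nodes(), G.nodes()))  # map onto G's node labels
    H.add_edges_from((relabel[u], relabel[v]) for u, v in R.edges())
    return H


# Usage: encode and rewire a small cycle graph.
G = nx.cycle_graph(10)
P = rrwp_encoding(G, num_steps=4)               # shape (4, 10, 10)
G_rewired = superimpose_random_regular(G, degree=3)
print(P.shape, G_rewired.number_of_edges())
```

A random regular graph is a natural choice for the superimposed edges because its uniform degree keeps the added message-passing load balanced across nodes while sharply reducing the graph's effective diameter.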
Cite
Text
Liao and Poczos. "Greener GRASS: Enhancing GNNs with Encoding, Rewiring, and Attention." International Conference on Learning Representations, 2025.
Markdown
[Liao and Poczos. "Greener GRASS: Enhancing GNNs with Encoding, Rewiring, and Attention." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/liao2025iclr-greener/)
BibTeX
@inproceedings{liao2025iclr-greener,
title = {{Greener GRASS: Enhancing GNNs with Encoding, Rewiring, and Attention}},
author = {Liao, Tongzhou and Poczos, Barnabas},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/liao2025iclr-greener/}
}