Improving Subgraph-GNNs via Edge-Level Ego-Network Encodings
Abstract
We present a novel edge-level ego-network encoding for learning on graphs that can boost Message Passing Graph Neural Networks (MP-GNNs) by providing additional node and edge features or by extending message-passing formats. The proposed encoding is sufficient to distinguish Strongly Regular Graphs, a family of challenging 3-WL-equivalent graphs. We show theoretically that such an encoding is more expressive than node-based subgraph MP-GNNs. In an empirical evaluation on four benchmarks with 10 graph datasets, our results match or improve previous baselines on expressivity, graph classification, graph regression, and proximity tasks, while reducing memory usage by 18.1x in certain real-world settings.
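As a rough illustration of what an edge-level ego-network feature could look like, the sketch below counts, for each edge, how many k-hop ego-networks contain both of its endpoints; the resulting counts could be appended to existing edge features of an MP-GNN. This is a minimal sketch under assumed definitions, not the paper's actual encoding, and the function name `edge_ego_counts` is hypothetical.

```python
# Hypothetical edge-level ego-network feature (illustrative only, not the
# paper's encoding): for each edge (u, v), count the number of k-hop
# ego-networks that contain both endpoints.
import networkx as nx


def edge_ego_counts(graph: nx.Graph, k: int = 2) -> dict:
    """Map each edge to the number of k-hop ego-networks containing it."""
    counts = {edge: 0 for edge in graph.edges()}
    for root in graph.nodes():
        # Nodes within distance k of the ego-network root.
        ego_nodes = set(
            nx.single_source_shortest_path_length(graph, root, cutoff=k)
        )
        for u, v in graph.edges():
            if u in ego_nodes and v in ego_nodes:
                counts[(u, v)] += 1
    return counts


if __name__ == "__main__":
    g = nx.cycle_graph(6)
    print(edge_ego_counts(g, k=2))
```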
Cite
Text
Alvarez-Gonzalez et al. "Improving Subgraph-GNNs via Edge-Level Ego-Network Encodings." Transactions on Machine Learning Research, 2024.
Markdown
[Alvarez-Gonzalez et al. "Improving Subgraph-GNNs via Edge-Level Ego-Network Encodings." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/alvarezgonzalez2024tmlr-improving/)
BibTeX
@article{alvarezgonzalez2024tmlr-improving,
title = {{Improving Subgraph-GNNs via Edge-Level Ego-Network Encodings}},
author = {Alvarez-Gonzalez, Nurudin and Kaltenbrunner, Andreas and Gómez, Vicenç},
journal = {Transactions on Machine Learning Research},
year = {2024},
url = {https://mlanthology.org/tmlr/2024/alvarezgonzalez2024tmlr-improving/}
}