On Oversquashing in Graph Neural Networks Through the Lens of Dynamical Systems

Abstract

A common problem in Message-Passing Neural Networks is oversquashing -- the limited ability to facilitate effective information flow between distant nodes. Oversquashing is attributed to the exponential decay in information transmission as node distances increase. This paper introduces a novel perspective to address oversquashing, leveraging the dynamical-systems properties of global and local non-dissipativity, which enable a constant rate of information flow to be maintained. We present SWAN, a uniquely parameterized GNN model with antisymmetry in both the space and weight domains, as a means to obtain non-dissipativity. Our theoretical analysis asserts that by implementing these properties, SWAN offers an enhanced ability to transmit information over extended distances. Empirical evaluations on synthetic and real-world benchmarks that emphasize long-range interactions validate the theoretical understanding of SWAN and its ability to mitigate oversquashing.
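The link between antisymmetric weights and non-dissipativity can be illustrated numerically. The sketch below is not the paper's SWAN implementation; it only demonstrates the underlying linear-algebra fact the abstract relies on: an antisymmetric matrix has a purely imaginary spectrum, so the linear dynamics it induces neither decay nor amplify signals.

```python
import numpy as np

# Illustrative sketch (not the authors' SWAN code): for any square W,
# W_a = W - W^T is antisymmetric. Its eigenvalues are purely imaginary,
# so the continuous-time system x'(t) = W_a x(t) conserves ||x(t)|| --
# the non-dissipative behavior that motivates antisymmetric weights.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
W_a = W - W.T                      # antisymmetric: W_a^T == -W_a

# Purely imaginary spectrum => no exponential decay of transmitted signals.
eigvals = np.linalg.eigvals(W_a)
print(np.allclose(eigvals.real, 0.0))  # True
```

With real eigenvalue parts pinned at zero, information carried by each eigenmode oscillates rather than shrinking, which is the mechanism the paper exploits to keep information flowing between distant nodes.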

Cite

Text

Gravina et al. "On Oversquashing in Graph Neural Networks Through the Lens of Dynamical Systems." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I16.33858

Markdown

[Gravina et al. "On Oversquashing in Graph Neural Networks Through the Lens of Dynamical Systems." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/gravina2025aaai-oversquashing/) doi:10.1609/AAAI.V39I16.33858

BibTeX

@inproceedings{gravina2025aaai-oversquashing,
  title     = {{On Oversquashing in Graph Neural Networks Through the Lens of Dynamical Systems}},
  author    = {Gravina, Alessio and Eliasof, Moshe and Gallicchio, Claudio and Bacciu, Davide and Schönlieb, Carola-Bibiane},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {16906--16914},
  doi       = {10.1609/AAAI.V39I16.33858},
  url       = {https://mlanthology.org/aaai/2025/gravina2025aaai-oversquashing/}
}