DRew: Dynamically Rewired Message Passing with Delay

Abstract

Message passing neural networks (MPNNs) have been shown to suffer from the phenomenon of over-squashing that causes poor performance for tasks relying on long-range interactions. This can be largely attributed to message passing only occurring locally, over a node's immediate neighbours. Rewiring approaches attempting to make graphs 'more connected', and supposedly better suited to long-range tasks, often lose the inductive bias provided by distance on the graph since they make distant nodes communicate instantly at every layer. In this paper we propose a framework, applicable to any MPNN architecture, that performs a layer-dependent rewiring to ensure gradual densification of the graph. We also propose a delay mechanism that permits skip connections between nodes depending on the layer and their mutual distance. We validate our approach on several long-range tasks and show that it outperforms graph Transformers and multi-hop MPNNs.
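To make the two mechanisms in the abstract concrete, here is a minimal sketch in plain numpy, not the paper's implementation. It assumes a simplified reading of DRew/νDRew: at layer t, a node receives messages from nodes at hop distance k ≤ t + 1 (gradual densification), and with a delay parameter ν, a k-hop message uses the sender's state from an assumed (k − 1) // ν layers earlier. The names (`drew_forward`, `bfs_distances`) and the mean/ReLU aggregation are illustrative placeholders for the MPNN aggregators used in the paper.

```python
# Hypothetical simplification of dynamically rewired message passing with delay:
#   - at layer t, node i receives messages from nodes at distance k <= t + 1;
#   - a k-hop message uses the sender's state from (k - 1) // nu layers back
#     (assumed delay form), so distant information arrives gradually.
import numpy as np
from collections import deque

def bfs_distances(adj):
    """All-pairs hop distances via BFS; -1 marks unreachable pairs."""
    n = len(adj)
    dist = -np.ones((n, n), dtype=int)
    for s in range(n):
        dist[s, s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if dist[s, v] < 0:
                    dist[s, v] = dist[s, u] + 1
                    q.append(v)
    return dist

def drew_forward(x, adj, num_layers, nu=1, rng=None):
    """Stack of dynamically rewired layers with delay.

    x: (n, d) node features; adj: adjacency lists; nu: delay parameter.
    """
    rng = rng or np.random.default_rng(0)
    n, d = x.shape
    dist = bfs_distances(adj)
    weights = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(num_layers)]
    history = [x]  # keep h^(0), h^(1), ... so delayed messages can look back
    for t in range(num_layers):
        h = history[-1]
        new_h = np.zeros_like(h)
        for i in range(n):
            msgs = []
            for k in range(1, t + 2):  # distances "unlocked" at layer t
                nbrs = np.where(dist[i] == k)[0]
                if len(nbrs) == 0:
                    continue
                src_layer = max(0, t - (k - 1) // nu)  # delayed sender state
                msgs.append(history[src_layer][nbrs].mean(axis=0))
            agg = np.mean(msgs, axis=0) if msgs else np.zeros(d)
            new_h[i] = np.maximum(0.0, (h[i] + agg) @ weights[t])  # toy update
        history.append(new_h)
    return history[-1]

# Usage: a 6-node path graph; information from node 5 reaches node 0 gradually.
adj = [[1], [0, 2], [1, 3], [2, 4], [3, 5], [4]]
x = np.eye(6, 8)  # (n=6, d=8) toy features
out = drew_forward(x, adj, num_layers=4, nu=1)
print(out.shape)  # (6, 8)
```

Note the two limiting cases: ν large enough recovers instant multi-hop communication at every unlocked distance (no delay), while ν = 1 makes a k-hop message wait roughly k layers, preserving the inductive bias of graph distance.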

Cite

Text

Gutteridge et al. "DRew: Dynamically Rewired Message Passing with Delay." International Conference on Machine Learning, 2023.

Markdown

[Gutteridge et al. "DRew: Dynamically Rewired Message Passing with Delay." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/gutteridge2023icml-drew/)

BibTeX

@inproceedings{gutteridge2023icml-drew,
  title     = {{DRew: Dynamically Rewired Message Passing with Delay}},
  author    = {Gutteridge, Benjamin and Dong, Xiaowen and Bronstein, Michael M. and Di Giovanni, Francesco},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {12252--12267},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/gutteridge2023icml-drew/}
}