Persistent Message Passing
Abstract
Graph neural networks (GNNs) are a powerful inductive bias for modelling algorithmic reasoning procedures and data structures. Their power has mainly been demonstrated on tasks featuring Markovian dynamics, where querying any associated data structure depends only on its latest state. For many tasks of interest, however, it may be highly beneficial to support efficient data structure queries dependent on previous states. This requires tracking the data structure's evolution through time, placing significant pressure on the GNN's latent representations. We introduce Persistent Message Passing (PMP), a mechanism which endows GNNs with the capability of querying past states by explicitly persisting them: rather than overwriting node representations, it creates new nodes whenever required. PMP generalises out-of-distribution to more than 2$\times$ larger test inputs on dynamic temporal range queries, significantly outperforming GNNs which overwrite states.
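The "persist rather than overwrite" idea can be illustrated with a minimal sketch: each update that changes a node's state appends a new version instead of mutating the old one, so any past state remains queryable. All names here are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of persistence as used in PMP-style models:
# writes append new node versions; reads can target any past timestep.
class PersistentNodeStore:
    def __init__(self):
        # node_id -> list of (timestep, state) versions, in write order
        self.versions = {}

    def write(self, node_id, t, state):
        # Persist: append a new version; old versions are never overwritten.
        self.versions.setdefault(node_id, []).append((t, state))

    def read(self, node_id, t):
        # Query the latest version written at or before timestep t.
        latest = None
        for ts, state in self.versions.get(node_id, []):
            if ts <= t:
                latest = state
        return latest


store = PersistentNodeStore()
store.write("v1", 0, [0.0])
store.write("v1", 2, [1.0])
print(store.read("v1", 1))  # the t=0 state is still visible at t=1
```

An overwriting model would have to compress this entire history into a single latent vector per node; persisting versions sidesteps that pressure at the cost of growing the node set.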
Cite
Text
Strathmann et al. "Persistent Message Passing." ICLR 2021 Workshops: GTRL, 2021.
Markdown
[Strathmann et al. "Persistent Message Passing." ICLR 2021 Workshops: GTRL, 2021.](https://mlanthology.org/iclrw/2021/strathmann2021iclrw-persistent/)
BibTeX
@inproceedings{strathmann2021iclrw-persistent,
title = {{Persistent Message Passing}},
author = {Strathmann, Heiko and Barekatain, Mohammadamin and Blundell, Charles and Veličković, Petar},
booktitle = {ICLR 2021 Workshops: GTRL},
year = {2021},
url = {https://mlanthology.org/iclrw/2021/strathmann2021iclrw-persistent/}
}