Ordered Subgraph Aggregation Networks
Abstract
Numerous subgraph-enhanced graph neural networks (GNNs) have emerged recently, provably boosting the expressive power of standard (message-passing) GNNs. However, there is a limited understanding of how these approaches relate to each other and to the Weisfeiler-Leman hierarchy. Moreover, current approaches either use all subgraphs of a given size, sample them uniformly at random, or use hand-crafted heuristics instead of learning to select subgraphs in a data-driven manner. Here, we offer a unified way to study such architectures by introducing a theoretical framework and extending the known expressivity results of subgraph-enhanced GNNs. Concretely, we show that increasing subgraph size always increases the expressive power and develop a better understanding of their limitations by relating them to the established $k\mathsf{\text{-}WL}$ hierarchy. In addition, we explore different approaches for learning to sample subgraphs using recent methods for backpropagating through complex discrete probability distributions. Empirically, we study the predictive performance of different subgraph-enhanced GNNs, showing that our data-driven architectures increase prediction accuracy on standard benchmark datasets compared to non-data-driven subgraph-enhanced graph neural networks while reducing computation time.
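The abstract mentions learning to sample subgraphs via recent methods for backpropagating through discrete probability distributions. As a minimal sketch of one such method (the Gumbel-softmax relaxation of Jang et al., not necessarily the estimator used in the paper), the snippet below softly selects a subgraph "root" node from hypothetical per-node scores; the function and variable names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Differentiable relaxation of categorical sampling: perturb the
    logits with Gumbel noise, then apply a temperature-scaled softmax.
    As tau -> 0 the output approaches a one-hot sample."""
    rng = np.random.default_rng(rng)
    # Gumbel(0, 1) noise via the inverse-CDF trick; eps avoids log(0).
    eps = 1e-20
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape) + eps) + eps)
    y = (logits + gumbel) / tau
    y = np.exp(y - y.max())  # numerically stable softmax
    return y / y.sum()

# Hypothetical per-node scores, e.g. produced by an upstream GNN layer.
node_logits = np.array([0.1, 2.0, -0.5, 0.3])
probs = gumbel_softmax(node_logits, tau=0.5, rng=0)
```

In a learned sampler, `probs` would weight the subgraph rooted at each node, and gradients flow through the relaxation back into the scoring network.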
Cite
Text
Qian et al. "Ordered Subgraph Aggregation Networks." Neural Information Processing Systems, 2022.
Markdown
[Qian et al. "Ordered Subgraph Aggregation Networks." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/qian2022neurips-ordered/)
BibTeX
@inproceedings{qian2022neurips-ordered,
title = {{Ordered Subgraph Aggregation Networks}},
author = {Qian, Chendi and Rattan, Gaurav and Geerts, Floris and Niepert, Mathias and Morris, Christopher},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/qian2022neurips-ordered/}
}