On Graph Neural Networks Versus Graph-Augmented MLPs
Abstract
From the perspectives of expressive power and learning, this work compares multi-layer Graph Neural Networks (GNNs) with a simplified alternative that we call Graph-Augmented Multi-Layer Perceptrons (GA-MLPs), which first augments node features with certain multi-hop operators on the graph and then applies learnable node-wise functions. From the perspective of graph isomorphism testing, we show both theoretically and numerically that GA-MLPs with suitable operators can distinguish almost all non-isomorphic graphs, just like the Weisfeiler-Lehman (WL) test and GNNs. However, by viewing them as node-level functions and examining the equivalence classes they induce on rooted graphs, we prove a separation in expressive power between GA-MLPs and GNNs that grows exponentially in depth. In particular, unlike GNNs, GA-MLPs are unable to count the number of attributed walks. We also demonstrate via community detection experiments that GA-MLPs can be limited by their choice of operator family, whereas GNNs have higher flexibility in learning.
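The GA-MLP construction described in the abstract can be sketched concretely: node features are first augmented by stacking powers of a graph operator (here, assumed to be the raw adjacency matrix; the paper considers a family of such operators), and a learnable node-wise function is then applied to each augmented row. The function and variable names below are illustrative, not from the paper.

```python
import numpy as np

def ga_mlp_features(A, X, K=2):
    """Augment node features with multi-hop operators.

    Concatenates [X, A X, A^2 X, ..., A^K X] column-wise, where A is a
    graph operator (assumed here to be the adjacency matrix) and X is the
    (n_nodes, d) feature matrix. A node-wise MLP applied to each row of
    the result completes the GA-MLP.
    """
    feats = [X]
    cur = X
    for _ in range(K):
        cur = A @ cur          # one more hop of propagation
        feats.append(cur)
    return np.concatenate(feats, axis=1)

# Toy example: a 3-node path graph with one-hot node features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3)
Z = ga_mlp_features(A, X, K=2)   # shape (3, 9): raw, 1-hop, 2-hop blocks
```

Because the propagation step is fixed and non-learnable, the augmentation can be precomputed once per graph; this is the efficiency appeal of GA-MLPs, and also the source of the expressiveness gap versus GNNs that the paper proves.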
Cite
Text
Chen et al. "On Graph Neural Networks Versus Graph-Augmented MLPs." International Conference on Learning Representations, 2021.
Markdown
[Chen et al. "On Graph Neural Networks Versus Graph-Augmented MLPs." International Conference on Learning Representations, 2021.](https://mlanthology.org/iclr/2021/chen2021iclr-graph/)
BibTeX
@inproceedings{chen2021iclr-graph,
  title     = {{On Graph Neural Networks Versus Graph-Augmented MLPs}},
  author    = {Chen, Lei and Chen, Zhengdao and Bruna, Joan},
  booktitle = {International Conference on Learning Representations},
  year      = {2021},
  url       = {https://mlanthology.org/iclr/2021/chen2021iclr-graph/}
}