Wasserstein Barycenter Matching for Graph Size Generalization of Message Passing Neural Networks

Abstract

Graph size generalization is hard for message passing neural networks (MPNNs): their graph-level classification performance degrades across varying graph sizes. Recent theoretical studies reveal that a slow, uncontrollable convergence rate w.r.t. graph size can adversely affect size generalization. To address the uncontrollable convergence rate caused by correlations across nodes in the underlying dimensional signal-generating space, we propose to use Wasserstein barycenters as a graph-level consensus to combat node-level correlations. Methodologically, we propose a Wasserstein barycenter matching (WBM) layer that represents an input graph by the Wasserstein distances between its MPNN-filtered node embeddings and a set of learned class-wise barycenters. Theoretically, we show that the convergence rate of an MPNN with a WBM layer is controllable and independent of the dimensionality of the signal-generating space. Thus MPNNs with WBM layers are less susceptible to slow, uncontrollable convergence rates and size variations. Empirically, the WBM layer significantly improves size generalization over vanilla MPNNs with different backbones (e.g., GCN, GIN, and PNA) on real-world graph datasets.
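To make the WBM idea concrete, here is a minimal sketch of how a graph could be represented by its Wasserstein distances to class-wise barycenters. This is an illustrative reimplementation, not the authors' code: `sinkhorn_distance` and `wbm_features` are hypothetical helper names, the barycenters are random stand-ins for learned ones, and entropic regularization with uniform node weights is assumed for tractability.

```python
import numpy as np

def sinkhorn_distance(X, Y, reg=1.0, n_iter=200):
    """Entropic-regularized Wasserstein distance between two uniformly
    weighted point clouds X (n x d) and Y (m x d) via Sinkhorn iterations."""
    n, m = X.shape[0], Y.shape[0]
    # Squared-Euclidean cost between node embeddings and barycenter support points
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    K = np.exp(-C / reg)                      # Gibbs kernel
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    u = np.ones(n)
    for _ in range(n_iter):                   # alternating marginal projections
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]           # approximate optimal transport plan
    return float((P * C).sum())               # transport cost under the plan

def wbm_features(node_embeddings, barycenters, reg=1.0):
    """WBM-style readout: one Wasserstein distance per class-wise barycenter."""
    return np.array([sinkhorn_distance(node_embeddings, B, reg) for B in barycenters])

rng = np.random.default_rng(0)
H = rng.normal(size=(12, 4))                  # MPNN-filtered node embeddings (one graph)
bary = [rng.normal(size=(5, 4)) for _ in range(3)]  # stand-ins for 3 learned barycenters
feat = wbm_features(H, bary)
print(feat.shape)                             # one distance per class: (3,)
```

Because the readout depends only on the distribution of node embeddings (uniform mass over nodes), its length is fixed by the number of barycenters rather than the graph size, which is what makes this representation size-agnostic.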

Cite

Text

Chu et al. "Wasserstein Barycenter Matching for Graph Size Generalization of Message Passing Neural Networks." International Conference on Machine Learning, 2023.

Markdown

[Chu et al. "Wasserstein Barycenter Matching for Graph Size Generalization of Message Passing Neural Networks." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/chu2023icml-wasserstein/)

BibTeX

@inproceedings{chu2023icml-wasserstein,
  title     = {{Wasserstein Barycenter Matching for Graph Size Generalization of Message Passing Neural Networks}},
  author    = {Chu, Xu and Jin, Yujie and Wang, Xin and Zhang, Shanghang and Wang, Yasha and Zhu, Wenwu and Mei, Hong},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {6158--6184},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/chu2023icml-wasserstein/}
}