Is Expressivity Essential for the Predictive Performance of Graph Neural Networks?

Abstract

Motivated by the large body of research on the expressivity of graph neural networks (GNNs), we study the impact of expressivity on their predictive performance. By performing knowledge distillation from highly expressive teacher GNNs to less expressive student GNNs, we demonstrate that distillation significantly reduces the predictive performance gap between teachers and students. As knowledge distillation does not increase the expressivity of the student GNN, it follows that most of this gap in predictive performance cannot be due to expressivity.
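To make the setup concrete, below is a minimal sketch of knowledge distillation from a pre-trained teacher GNN to a simpler student GNN. The model architecture, the soft-target loss with temperature scaling, and the loss weighting are illustrative assumptions for a standard distillation recipe, not necessarily the exact configuration used in the paper.

# Minimal knowledge-distillation sketch (illustrative, not the paper's exact setup).
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool


class StudentGNN(torch.nn.Module):
    """A simple message-passing GNN playing the role of the less expressive student."""

    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, hidden_dim)
        self.readout = torch.nn.Linear(hidden_dim, num_classes)

    def forward(self, x, edge_index, batch):
        x = self.conv1(x, edge_index).relu()
        x = self.conv2(x, edge_index).relu()
        return self.readout(global_mean_pool(x, batch))


def distillation_loss(student_logits, teacher_logits, labels,
                      alpha=0.5, temperature=2.0):
    """Combine the usual supervised loss with a soft-target loss that
    pulls the student's predictions toward the (frozen) teacher's."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return (1.0 - alpha) * hard + alpha * soft


def train_student(student, teacher, loader, optimizer):
    """One distillation epoch; `teacher` is an already-trained, more expressive GNN."""
    teacher.eval()  # the teacher stays frozen during distillation
    student.train()
    for data in loader:  # a torch_geometric DataLoader of batched graphs
        optimizer.zero_grad()
        with torch.no_grad():
            teacher_logits = teacher(data.x, data.edge_index, data.batch)
        student_logits = student(data.x, data.edge_index, data.batch)
        loss = distillation_loss(student_logits, teacher_logits, data.y)
        loss.backward()
        optimizer.step()

The key point for the paper's argument is that nothing in this procedure changes the student's architecture, so its expressivity is unchanged; only its training signal is enriched by the teacher's predictions.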

Cite

Text

Jogl et al. "Is Expressivity Essential for the Predictive Performance of Graph Neural Networks?" NeurIPS 2024 Workshops: SciForDL, 2024.

Markdown

[Jogl et al. "Is Expressivity Essential for the Predictive Performance of Graph Neural Networks?" NeurIPS 2024 Workshops: SciForDL, 2024.](https://mlanthology.org/neuripsw/2024/jogl2024neuripsw-expressivity/)

BibTeX

@inproceedings{jogl2024neuripsw-expressivity,
  title     = {{Is Expressivity Essential for the Predictive Performance of Graph Neural Networks?}},
  author    = {Jogl, Fabian and Welke, Pascal and Gärtner, Thomas},
  booktitle = {NeurIPS 2024 Workshops: SciForDL},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/jogl2024neuripsw-expressivity/}
}