Generalization Error of Graph Neural Networks in the Mean-Field Regime

Abstract

This work provides a theoretical framework for assessing the generalization error of graph neural networks in the over-parameterized regime, where the number of parameters exceeds the number of data points. We study two widely used types of graph neural networks: graph convolutional neural networks and message passing graph neural networks. Prior to this study, existing bounds on the generalization error in the over-parameterized regime were uninformative, limiting our understanding of over-parameterized network performance. Our novel approach involves deriving upper bounds within the mean-field regime for evaluating the generalization error of these graph neural networks. We establish upper bounds with a convergence rate of $O(1/n)$, where $n$ is the number of graph samples. These upper bounds offer a theoretical assurance of the networks' performance on unseen data in the challenging over-parameterized regime, and more broadly contribute to our understanding of their behavior.
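
For readers unfamiliar with the second architecture named in the abstract, below is a minimal NumPy sketch of a single message-passing layer. This is an illustration only, not the authors' formulation: all names (`message_passing_layer`, `W_msg`, `W_upd`) and the choice of sum aggregation with a `tanh` update are assumptions for the example. The paper's mean-field analysis concerns the limit in which the layer width grows large; this finite-width sketch only shows the message-passing computation itself.

```python
import numpy as np

def message_passing_layer(H, A, W_msg, W_upd):
    """One illustrative message-passing layer (hypothetical, not the paper's model).

    H     : (num_nodes, d) node feature matrix
    A     : (num_nodes, num_nodes) binary adjacency matrix
    W_msg : (d, d) weights applied to neighbor messages
    W_upd : (d, d) weights applied to a node's own state
    """
    # Each node sums messages transformed from its neighbors' features.
    messages = A @ (H @ W_msg)
    # The aggregated messages are combined with the node's own state
    # through a nonlinearity to produce the updated node features.
    return np.tanh(H @ W_upd + messages)

# Toy usage: a 3-node path graph with 4-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = rng.normal(size=(3, 4))
W_msg = rng.normal(size=(4, 4)) / np.sqrt(4)
W_upd = rng.normal(size=(4, 4)) / np.sqrt(4)
print(message_passing_layer(H, A, W_msg, W_upd).shape)  # (3, 4)
```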

Cite

Text

Aminian et al. "Generalization Error of Graph Neural Networks in the Mean-Field Regime." International Conference on Machine Learning, 2024.

Markdown

[Aminian et al. "Generalization Error of Graph Neural Networks in the Mean-Field Regime." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/aminian2024icml-generalization/)

BibTeX

@inproceedings{aminian2024icml-generalization,
  title     = {{Generalization Error of Graph Neural Networks in the Mean-Field Regime}},
  author    = {Aminian, Gholamali and He, Yixuan and Reinert, Gesine and Szpruch, Lukasz and Cohen, Samuel N.},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {1359--1391},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/aminian2024icml-generalization/}
}