A New Perspective on the Effects of Spectrum in Graph Neural Networks

Abstract

Many improvements to GNNs can be viewed as operations on the spectrum of the underlying graph matrix, which motivates us to directly study the characteristics of the spectrum and their effects on GNN performance. By generalizing most existing GNN architectures, we show that the correlation issue caused by an unsmooth spectrum becomes the obstacle to leveraging more powerful graph filters and to developing deep architectures, and therefore restricts GNN performance. Inspired by this, we propose a correlation-free architecture that naturally removes the correlation issue across channels, making it possible to utilize more sophisticated filters within each channel. The resulting correlation-free architecture with more powerful filters consistently boosts the performance of learning graph representations. Code is available at https://github.com/qslim/gnn-spectrum.
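
As a rough illustration of the idea sketched in the abstract, the snippet below applies a learnable polynomial graph filter independently within each feature channel, so no dense weight matrix mixes channels. This is a minimal sketch in PyTorch, not the authors' implementation (see the linked repository for the official code); the names CorrelationFreeLayer and poly_order are illustrative assumptions.

# Minimal sketch of a channel-wise ("correlation-free") polynomial spectral
# filter: each channel gets its own filter coefficients and channels are
# never mixed by a weight matrix. Not the authors' code; names are illustrative.
import torch
import torch.nn as nn

class CorrelationFreeLayer(nn.Module):
    def __init__(self, num_channels: int, poly_order: int):
        super().__init__()
        # One set of polynomial coefficients per channel.
        self.coeffs = nn.Parameter(torch.randn(poly_order + 1, num_channels) * 0.1)

    def forward(self, A_norm: torch.Tensor, X: torch.Tensor) -> torch.Tensor:
        # A_norm: (N, N) normalized graph matrix; X: (N, C) node features.
        out = self.coeffs[0] * X              # k = 0 (identity) term
        prop = X
        for k in range(1, self.coeffs.shape[0]):
            prop = A_norm @ prop              # k-th power of the graph matrix
            out = out + self.coeffs[k] * prop # channel-wise coefficients
        return out

# Toy usage on a random symmetric graph.
N, C = 5, 4
A = torch.rand(N, N)
A = ((A + A.T) > 1.0).float()
deg = A.sum(dim=1).clamp(min=1.0)
A_norm = A / deg.sqrt().unsqueeze(1) / deg.sqrt().unsqueeze(0)  # D^{-1/2} A D^{-1/2}
layer = CorrelationFreeLayer(num_channels=C, poly_order=3)
print(layer(A_norm, torch.randn(N, C)).shape)                   # torch.Size([5, 4])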

Cite

Text

Yang et al. "A New Perspective on the Effects of Spectrum in Graph Neural Networks." International Conference on Machine Learning, 2022.

Markdown

[Yang et al. "A New Perspective on the Effects of Spectrum in Graph Neural Networks." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/yang2022icml-new/)

BibTeX

@inproceedings{yang2022icml-new,
  title     = {{A New Perspective on the Effects of Spectrum in Graph Neural Networks}},
  author    = {Yang, Mingqi and Shen, Yanming and Li, Rui and Qi, Heng and Zhang, Qiang and Yin, Baocai},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {25261--25279},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/yang2022icml-new/}
}