Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks
Abstract
Our study reveals new theoretical insights into over-smoothing and feature over-correlation in graph neural networks. Specifically, we demonstrate that as depth increases, node representations become dominated by a low-dimensional subspace that depends on the aggregation function but not on the feature transformations. For all aggregation functions, the rank of the node representations collapses, resulting in over-smoothing for particular aggregation functions. Our findings emphasize that future research should focus on rank collapse rather than over-smoothing. Guided by our theory, we propose a sum of Kronecker products as a beneficial property that provably prevents over-smoothing, over-correlation, and rank collapse. We empirically demonstrate the shortcomings of existing models in fitting target functions of node classification tasks.
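To make the abstract's mechanism concrete, here is a minimal numerical sketch (not the authors' code) of both claims: under a single-term update X ← ÂXW, the representations collapse onto the dominant eigenspace of the aggregation Â regardless of the feature transformations W, while a two-term sum of Kronecker products, vec(X) ← (W₁ᵀ ⊗ Â + W₂ᵀ ⊗ I) vec(X), generically avoids this collapse. The graph, dimensions, and the specific two-term update below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative, not from the paper): an Erdos-Renyi graph
# with self-loops and GCN-style symmetric normalization
# A_hat = D^{-1/2} (A + I) D^{-1/2}.
n, d = 20, 8
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1)
A = A + A.T + np.eye(n)
deg = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(deg, deg))

X = rng.standard_normal((n, d))

def rand_orthogonal(k):
    """Random orthogonal weight matrix; it preserves singular values,
    so any rank decay is attributable to the aggregation, not to W."""
    Q, _ = np.linalg.qr(rng.standard_normal((k, k)))
    return Q

def spectrum(M):
    """Singular values of M, normalized by the largest one."""
    s = np.linalg.svd(M, compute_uv=False)
    return np.round(s / s[0], 6)

# (1) Single-term update X <- A_hat X W: after k layers this equals
# A_hat^k X (W_1 ... W_k), so the columns align with the dominant
# eigenvector of A_hat and the representations collapse to rank one.
Xk = X.copy()
for _ in range(60):
    Xk = A_hat @ Xk @ rand_orthogonal(d)
    Xk /= np.linalg.norm(Xk)  # rescale to avoid under-/overflow
print("single-term update:", spectrum(Xk))

# (2) A two-term sum of Kronecker products,
# X <- A_hat X W_1 + X W_2, i.e.
# vec(X) <- (W_1^T kron A_hat + W_2^T kron I) vec(X).
# The dominant direction of this product operator generically does not
# factor into a single Kronecker product, so its unvectorized form is
# not rank one.
Xk = X.copy()
for _ in range(60):
    Xk = A_hat @ Xk @ rand_orthogonal(d) + Xk @ rand_orthogonal(d)
    Xk /= np.linalg.norm(Xk)
print("two-term update:   ", spectrum(Xk))
```

In this toy setup, the printed spectrum of the single-term update should show only one non-negligible singular value after 60 layers, while the two-term update should retain a spread spectrum, illustrating why a sum of Kronecker products can prevent rank collapse.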
Cite
Text
Roth and Liebig. "Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks." Proceedings of the Second Learning on Graphs Conference, 2023.
Markdown
[Roth and Liebig. "Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks." Proceedings of the Second Learning on Graphs Conference, 2023.](https://mlanthology.org/log/2023/roth2023log-rank/)
BibTeX
@inproceedings{roth2023log-rank,
  title     = {{Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks}},
  author    = {Roth, Andreas and Liebig, Thomas},
  booktitle = {Proceedings of the Second Learning on Graphs Conference},
  year      = {2023},
  pages     = {35:1--35:23},
  volume    = {231},
  url       = {https://mlanthology.org/log/2023/roth2023log-rank/}
}