A Dynamical Systems-Inspired Pruning Strategy for Addressing Oversmoothing in Graph Attention Networks
Abstract
Graph Neural Networks (GNNs) face a critical limitation known as oversmoothing, where increasing network depth leads to homogenized node representations, severely compromising their expressiveness. We present a novel dynamical systems perspective on this challenge, revealing oversmoothing as an emergent property of GNNs’ convergence to low-dimensional attractor states. Based on this insight, we introduce DYNAMO-GAT, which combines noise-driven covariance analysis with Anti-Hebbian learning to dynamically prune attention weights, effectively preserving distinct attractor states. We provide theoretical guarantees for DYNAMO-GAT’s effectiveness and demonstrate its superior performance on benchmark datasets, consistently outperforming existing methods while requiring fewer computational resources. This work establishes a fundamental connection between dynamical systems theory and GNN behavior, providing both theoretical insights and practical solutions for deep graph learning.
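To make the abstract's pruning recipe concrete, below is a minimal illustrative sketch, not the authors' implementation: it estimates node-output correlations under injected Gaussian noise (standing in for the "noise-driven covariance analysis") and applies an Anti-Hebbian update that weakens, and eventually prunes, attention-mask entries between highly correlated nodes. The dense single-head layer, the names sigma, tau, eta, and anti_hebbian_prune, and the exact update rule are all assumptions made for exposition.

# Illustrative sketch only: a dense single-head attention layer, a noise-driven
# estimate of node-output correlations, and an Anti-Hebbian pruning mask.
# The names (sigma, tau, eta, anti_hebbian_prune) and the exact update rule are
# assumptions for exposition, not the paper's implementation.
import torch
import torch.nn.functional as F

def attention_scores(h, adj, W, a):
    """GAT-style scores e_ij = LeakyReLU(a^T [W h_i || W h_j]), row-softmaxed."""
    z = h @ W                                            # (N, d')
    N = z.size(0)
    pair = torch.cat([z.unsqueeze(1).expand(N, N, -1),   # h_i part
                      z.unsqueeze(0).expand(N, N, -1)],  # h_j part
                     dim=-1)
    e = F.leaky_relu(pair @ a, 0.2)
    e = e.masked_fill(adj == 0, float('-inf'))           # attend only along edges
    return F.softmax(e, dim=-1)

def noise_driven_covariance(h, adj, W, a, mask, sigma=0.1, samples=32):
    """Correlation of per-node output summaries under injected Gaussian noise."""
    outs = []
    for _ in range(samples):
        noisy = h + sigma * torch.randn_like(h)
        alpha = attention_scores(noisy, adj, W, a) * mask
        outs.append((alpha @ (noisy @ W)).mean(dim=-1))  # one scalar per node
    Y = torch.stack(outs)                                # (samples, N)
    return torch.nan_to_num(torch.corrcoef(Y.T))         # (N, N) correlations

def anti_hebbian_prune(mask, corr, adj, eta=0.2, tau=0.05):
    """Anti-Hebbian rule: weaken edges joining highly correlated nodes,
    then hard-prune mask entries that fall below the threshold tau."""
    mask = (mask - eta * corr.clamp(min=0) * adj).clamp(min=0.0)
    return (mask > tau).float() * mask

# Toy usage: a 6-node ring graph with random features.
N, d_in, d_out = 6, 8, 4
h = torch.randn(N, d_in)
adj = torch.roll(torch.eye(N), 1, dims=1) + torch.roll(torch.eye(N), -1, dims=1)
W = 0.1 * torch.randn(d_in, d_out)
a = torch.randn(2 * d_out)
mask = adj.clone()
for _ in range(3):                                       # a few pruning rounds
    corr = noise_driven_covariance(h, adj, W, a, mask)
    mask = anti_hebbian_prune(mask, corr, adj)
print(f"edges kept: {int((mask > 0).sum())} of {int(adj.sum())}")

The intuition mirrors the abstract: nodes whose responses to noise are already correlated are the ones being pulled toward a shared attractor state, so the rule removes precisely the edges that drive that collapse.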
Cite
Text
Chakraborty et al. "A Dynamical Systems-Inspired Pruning Strategy for Addressing Oversmoothing in Graph Attention Networks." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Chakraborty et al. "A Dynamical Systems-Inspired Pruning Strategy for Addressing Oversmoothing in Graph Attention Networks." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/chakraborty2025icml-dynamical/)
BibTeX
@inproceedings{chakraborty2025icml-dynamical,
  title     = {{A Dynamical Systems-Inspired Pruning Strategy for Addressing Oversmoothing in Graph Attention Networks}},
  author    = {Chakraborty, Biswadeep and Kumar, Harshit and Mukhopadhyay, Saibal},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {7250--7275},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/chakraborty2025icml-dynamical/}
}