Unleashing the Power of High-Pass Filtering in Continuous Graph Neural Networks

Abstract

Recent Continuous Graph Neural Networks (CGNNs) have attracted great attention for their ability to reach infinite depth without oversmoothing. However, most existing CGNNs inherently perform low-pass filtering, as they are derived from discrete Laplacian-smoothing-based graph neural networks (GNNs). While prior research has shown promising results for high-pass filtering in node representation learning, particularly on heterophilous graphs, it remains to be extended to the continuous domain, and the synergy between the two filtering channels is still unexplored. In this paper, we propose a novel dual-channel continuous graph neural network architecture that addresses this gap by leveraging both low-pass and high-pass filtering. In particular, we introduce a dimension masking method that coordinates the contribution of each low- and high-pass filtered feature dimension to node classification. Our aim is to deepen the understanding of the link between high-pass and low-pass filters, unraveling their distinct roles in learning node representations. To evaluate the effectiveness of our framework, we conduct extensive experiments on node classification over heterophilous graphs. Our results demonstrate the competitive performance of our approach and showcase its robustness to oversmoothing.
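As a rough illustration of the dual-channel idea (a minimal sketch, not the paper's implementation), the snippet below combines a low-pass channel, A_hat x, with a high-pass channel, (I - A_hat) x = L x, through a learnable per-dimension sigmoid gate standing in for the paper's dimension masking. The class name DualChannelFilter and the gate parameterization are illustrative assumptions.

import torch
import torch.nn as nn

def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    # Symmetrically normalized adjacency with self-loops: D^{-1/2}(A + I)D^{-1/2}.
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)

class DualChannelFilter(nn.Module):
    # Blends a low-pass channel (A_hat x, smoothing) with a high-pass channel
    # ((I - A_hat) x = L x, sharpening) via a learnable per-dimension gate
    # standing in for the paper's dimension masking (an assumption here).
    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Parameter(torch.zeros(dim))  # sigmoid(0) = 0.5: equal mix at init

    def forward(self, x: torch.Tensor, a_hat: torch.Tensor) -> torch.Tensor:
        m = torch.sigmoid(self.gate)       # per-dimension weight in (0, 1)
        low = a_hat @ x                    # low-pass filtered features
        high = x - low                     # high-pass filtered features: (I - A_hat) x
        return m * low + (1.0 - m) * high  # dimension-wise combination of both channels

# Usage on a small random graph:
adj = torch.randint(0, 2, (5, 5)).float()
adj = ((adj + adj.t()) > 0).float().fill_diagonal_(0)  # symmetric, no self-loops
x = torch.randn(5, 8)                                  # 5 nodes, 8 feature dimensions
out = DualChannelFilter(dim=8)(x, normalized_adjacency(adj))

In a continuous model, a filtered signal like this would drive an ODE integrated over depth; the single-step form above is kept only to make the two channels and the mask explicit.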

Cite

Text

Zhang and Li. "Unleashing the Power of High-Pass Filtering in Continuous Graph Neural Networks." Proceedings of the 15th Asian Conference on Machine Learning, 2023.

Markdown

[Zhang and Li. "Unleashing the Power of High-Pass Filtering in Continuous Graph Neural Networks." Proceedings of the 15th Asian Conference on Machine Learning, 2023.](https://mlanthology.org/acml/2023/zhang2023acml-unleashing/)

BibTeX

@inproceedings{zhang2023acml-unleashing,
  title     = {{Unleashing the Power of High-Pass Filtering in Continuous Graph Neural Networks}},
  author    = {Zhang, Acong and Li, Ping},
  booktitle = {Proceedings of the 15th Asian Conference on Machine Learning},
  year      = {2023},
  pages     = {1683--1698},
  volume    = {222},
  url       = {https://mlanthology.org/acml/2023/zhang2023acml-unleashing/}
}