Wiener Graph Deconvolutional Network Improves Graph Self-Supervised Learning
Abstract
Graph self-supervised learning (SSL) has been widely employed to learn representations from unlabeled graphs. Existing methods can be roughly divided into predictive learning and contrastive learning, with the latter attracting more research attention thanks to its better empirical performance. We argue, however, that predictive models equipped with a powerful decoder can achieve representation power comparable to or even better than that of contrastive models. In this work, we propose a Wiener Graph Deconvolutional Network (WGDN), an augmentation-adaptive decoder empowered by the graph Wiener filter to perform information reconstruction. Theoretical analysis proves the superior reconstruction ability of the graph Wiener filter. Extensive experimental results on various datasets demonstrate the effectiveness of our approach.
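For intuition, the classical Wiener deconvolution filter, written in the graph spectral domain, illustrates how such a decoder can invert a smoothing graph-convolutional encoder while suppressing noise. This is a minimal sketch based on the standard Wiener filter; the paper's exact graph Wiener filter and its augmentation-adaptive form may differ. Here $L = U \Lambda U^{\top}$ is the graph Laplacian eigendecomposition, $h(\lambda)$ denotes the encoder's spectral response, and $\mathrm{SNR}(\lambda)$ an assumed per-frequency signal-to-noise ratio:

$$
g(\lambda_i) = \frac{h(\lambda_i)}{h(\lambda_i)^2 + \mathrm{SNR}(\lambda_i)^{-1}},
\qquad
\hat{X} = U\, g(\Lambda)\, U^{\top} Z,
$$

where $Z$ is the encoder output and $\hat{X}$ the reconstruction. When the noise vanishes ($\mathrm{SNR} \to \infty$), $g$ reduces to the exact inverse filter $1/h(\lambda_i)$; with noise, the extra term keeps the deconvolution from amplifying weakly passed frequencies.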
Cite
Text
Cheng et al. "Wiener Graph Deconvolutional Network Improves Graph Self-Supervised Learning." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I6.25870
Markdown
[Cheng et al. "Wiener Graph Deconvolutional Network Improves Graph Self-Supervised Learning." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/cheng2023aaai-wiener/) doi:10.1609/AAAI.V37I6.25870
BibTeX
@inproceedings{cheng2023aaai-wiener,
title = {{Wiener Graph Deconvolutional Network Improves Graph Self-Supervised Learning}},
author = {Cheng, Jiashun and Li, Man and Li, Jia and Tsung, Fugee},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
  pages = {7131--7139},
doi = {10.1609/AAAI.V37I6.25870},
url = {https://mlanthology.org/aaai/2023/cheng2023aaai-wiener/}
}