Linkless Link Prediction via Relational Distillation

Abstract

Graph Neural Networks (GNNs) have shown exceptional performance in the task of link prediction. Despite their effectiveness, the high latency caused by non-trivial neighborhood data dependencies limits GNNs in practical deployments. Conversely, multi-layer perceptrons (MLPs), while efficient, are much less effective than GNNs due to their lack of relational knowledge. In this work, to combine the advantages of GNNs and MLPs, we begin by exploring direct knowledge distillation (KD) methods for link prediction, i.e., matching predicted logits and matching node representations. Observing that these direct KD methods do not perform well for link prediction, we propose a relational KD framework, Linkless Link Prediction (LLP), to distill knowledge to MLPs for link prediction. Unlike direct KD methods, which match independent link logits or node representations, LLP distills relational knowledge centered around each (anchor) node to the student MLP. Specifically, we propose complementary rank-based and distribution-based matching strategies. Extensive experiments demonstrate that LLP boosts the link prediction performance of MLPs by significant margins and even outperforms the teacher GNNs on 7 out of 8 benchmarks. LLP also achieves a 70.68x speedup in link prediction inference compared to GNNs on the large-scale OGB dataset.
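The two matching strategies named in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the exact loss forms, temperature, margin, and how context nodes are sampled around each anchor follow the paper, and the function names here (`distribution_loss`, `rank_loss`) are hypothetical. The sketch assumes the teacher GNN and student MLP each produce a score for the link between an anchor node and every node in a sampled context set; distribution-based matching compares the softmaxed score distributions, and rank-based matching penalizes the student for reversing pairwise orderings the teacher assigns.

```python
import math

def softmax(scores, temperature=1.0):
    """Softmax over a list of link scores with a temperature."""
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def distribution_loss(teacher_scores, student_scores, temperature=1.0):
    """Distribution-based matching (sketch): KL divergence between the
    teacher's and student's score distributions over an anchor node's
    sampled context nodes."""
    p = softmax(teacher_scores, temperature)  # teacher distribution
    q = softmax(student_scores, temperature)  # student distribution
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def rank_loss(teacher_scores, student_scores, margin=0.1):
    """Rank-based matching (sketch): for each context pair the teacher
    orders, apply a margin ranking loss when the student's scores do
    not respect that order by at least `margin`."""
    loss, pairs = 0.0, 0
    n = len(teacher_scores)
    for i in range(n):
        for j in range(i + 1, n):
            if teacher_scores[i] > teacher_scores[j]:
                diff = student_scores[i] - student_scores[j]
            elif teacher_scores[i] < teacher_scores[j]:
                diff = student_scores[j] - student_scores[i]
            else:
                continue  # teacher is indifferent; skip the pair
            loss += max(0.0, margin - diff)
            pairs += 1
    return loss / max(pairs, 1)
```

A student whose scores preserve the teacher's ordering with sufficient margin incurs zero rank loss even if its absolute logits differ, which is one intuition for why relational matching can be more forgiving than matching each link logit directly.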

Cite

Text

Guo et al. "Linkless Link Prediction via Relational Distillation." International Conference on Machine Learning, 2023.

Markdown

[Guo et al. "Linkless Link Prediction via Relational Distillation." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/guo2023icml-linkless/)

BibTeX

@inproceedings{guo2023icml-linkless,
  title     = {{Linkless Link Prediction via Relational Distillation}},
  author    = {Guo, Zhichun and Shiao, William and Zhang, Shichang and Liu, Yozen and Chawla, Nitesh V. and Shah, Neil and Zhao, Tong},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {12012--12033},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/guo2023icml-linkless/}
}