Large Language Model Meets Graph Neural Network in Knowledge Distillation

Abstract

While Large Language Models (LLMs) show promise for learning on Text-Attributed Graphs (TAGs), their deployment is hindered by computational demands. Graph Neural Networks (GNNs) are efficient but struggle with TAGs' complex semantics. We propose LinguGKD, a novel LLM-to-GNN knowledge distillation framework that transfers both local semantic details and global structural information from LLMs to GNNs. First, it introduces TAG-oriented instruction tuning, enhancing LLMs with graph-specific knowledge through carefully designed prompts. Next, it develops a layer-adaptive multi-scale contrastive distillation strategy that aligns LLM and GNN features at multiple granularities, from node level to graph level. Finally, the distilled GNNs combine the semantic richness of LLMs with the computational efficiency of traditional GNNs. Experiments demonstrate that LinguGKD outperforms existing graph distillation frameworks; the distilled simple GNNs achieve performance comparable or superior to more complex GNNs and teacher LLMs while maintaining computational efficiency. This work bridges the gap between LLMs and GNNs, facilitating advanced graph learning in resource-constrained environments and providing a framework to leverage ongoing LLM advancements for GNN improvement.
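The abstract does not spell out the contrastive distillation objective, so as an illustration only, here is a minimal sketch of a node-level contrastive alignment loss in the InfoNCE style: each student (GNN) node embedding is pulled toward the teacher (LLM) embedding of the same node and pushed away from other nodes' teacher embeddings. The temperature `tau` and the assumption of same-dimension features are hypothetical choices, not details from the paper.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_align_loss(student_feats, teacher_feats, tau=0.1):
    """InfoNCE-style alignment loss (illustrative, not the paper's exact loss).

    For each node i, the positive pair is (student_i, teacher_i);
    all other teacher embeddings act as negatives.
    """
    n = len(student_feats)
    total = 0.0
    for i in range(n):
        # Temperature-scaled similarities of student_i to every teacher embedding.
        sims = [cosine(student_feats[i], t) / tau for t in teacher_feats]
        log_denom = math.log(sum(math.exp(s) for s in sims))
        # Negative log-likelihood of picking the matching teacher embedding.
        total += -(sims[i] - log_denom)
    return total / n
```

With perfectly aligned features the loss is lower than with swapped (misaligned) ones, which is the property a distillation objective of this kind relies on. The paper's actual strategy is layer-adaptive and multi-scale (node level through graph level), which this single-level sketch does not capture.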

Cite

Text

Hu et al. "Large Language Model Meets Graph Neural Network in Knowledge Distillation." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I16.33901

Markdown

[Hu et al. "Large Language Model Meets Graph Neural Network in Knowledge Distillation." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/hu2025aaai-large/) doi:10.1609/AAAI.V39I16.33901

BibTeX

@inproceedings{hu2025aaai-large,
  title     = {{Large Language Model Meets Graph Neural Network in Knowledge Distillation}},
  author    = {Hu, Shengxiang and Zou, Guobing and Yang, Song and Lin, Shiyi and Gan, Yanglan and Zhang, Bofeng and Chen, Yixin},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {17295--17304},
  doi       = {10.1609/AAAI.V39I16.33901},
  url       = {https://mlanthology.org/aaai/2025/hu2025aaai-large/}
}