Harnessing Language Model for Cross-Heterogeneity Graph Knowledge Transfer

Abstract

Heterogeneous graphs (HGs) that contain various node and edge types are ubiquitous in real-world scenarios. To address the common label sparsity problem in HGs, some researchers propose to pretrain on source HGs to extract general knowledge and then fine-tune on a target HG for knowledge transfer. However, existing methods often assume that source and target HGs share a single heterogeneity, i.e., that they have the same node and edge types, which contradicts real-world scenarios that require cross-heterogeneity transfer. Although a recent study has made preliminary attempts at cross-heterogeneity learning, its definition of general knowledge relies heavily on human knowledge, which lacks flexibility and leads to suboptimal transfer. To address this problem, we propose a novel Language Model-enhanced Cross-Heterogeneity learning model, named LMCH. Specifically, we first design a metapath-based corpus construction method to unify HG representations as language. The corpora of the source HGs are then used to fine-tune a pretrained Language Model (LM), enabling the LM to autonomously extract general knowledge across different HGs. Furthermore, to fully exploit the abundant unlabeled nodes in a sparsely labeled target HG, we propose an iterative training pipeline assisted by an additional Graph Neural Network (GNN) predictor, enhanced by LM-GNN contrastive alignment at the end of each iteration. Extensive experiments on four real-world datasets demonstrate the superior performance of LMCH over state-of-the-art methods.
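The abstract only sketches the pipeline, so the following is a minimal, hypothetical Python sketch of what its two key ingredients might look like: serializing an HG into metapath-guided "sentences" that a pretrained LM can consume, and an InfoNCE-style loss for aligning LM and GNN node embeddings. The function names, data layout, and the specific contrastive formulation are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's released code): metapath-guided corpus
# construction plus a simple symmetric InfoNCE loss for LM-GNN alignment.
import random
import torch
import torch.nn.functional as F

def metapath_corpus(adj, node_type, node_text, metapath, walks_per_node=2):
    """Build a toy text corpus by walking a metapath, e.g. ("author", "paper", "author").

    adj:       dict mapping (src_type, dst_type) -> {src_id: [dst_id, ...]}
    node_type: dict mapping node_id -> type name
    node_text: dict mapping node_id -> short textual description of the node
    """
    corpus = []
    start_nodes = [n for n, t in node_type.items() if t == metapath[0]]
    for start in start_nodes:
        for _ in range(walks_per_node):
            walk, cur, ok = [start], start, True
            for src_t, dst_t in zip(metapath[:-1], metapath[1:]):
                neighbors = adj.get((src_t, dst_t), {}).get(cur, [])
                if not neighbors:
                    ok = False
                    break
                cur = random.choice(neighbors)
                walk.append(cur)
            if ok:
                # One "sentence" per walk: interleave node types and descriptions.
                corpus.append(" ".join(f"[{node_type[n]}] {node_text[n]}" for n in walk))
    return corpus

def lm_gnn_alignment_loss(lm_emb, gnn_emb, temperature=0.1):
    """Symmetric InfoNCE loss: the LM and GNN embeddings of the same node form a
    positive pair; all other nodes in the batch serve as negatives."""
    lm_emb = F.normalize(lm_emb, dim=-1)
    gnn_emb = F.normalize(gnn_emb, dim=-1)
    logits = lm_emb @ gnn_emb.t() / temperature      # (N, N) similarity matrix
    labels = torch.arange(lm_emb.size(0))            # positives on the diagonal
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))
```

Under these assumptions, the corpus would feed LM fine-tuning on the source HGs, while the alignment loss would be applied between the LM and GNN predictors at the end of each iteration on the target HG.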

Cite

Text

Yang et al. "Harnessing Language Model for Cross-Heterogeneity Graph Knowledge Transfer." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I12.33421

Markdown

[Yang et al. "Harnessing Language Model for Cross-Heterogeneity Graph Knowledge Transfer." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/yang2025aaai-harnessing/) doi:10.1609/AAAI.V39I12.33421

BibTeX

@inproceedings{yang2025aaai-harnessing,
  title     = {{Harnessing Language Model for Cross-Heterogeneity Graph Knowledge Transfer}},
  author    = {Yang, Jinyu and Wang, Ruijia and Yang, Cheng and Yan, Bo and Zhou, Qimin and Juan, Yang and Shi, Chuan},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {13026--13034},
  doi       = {10.1609/AAAI.V39I12.33421},
  url       = {https://mlanthology.org/aaai/2025/yang2025aaai-harnessing/}
}