Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion

Abstract

In recent years, knowledge graph completion (KGC) models based on pre-trained language models (PLMs) have shown promising results. However, the large number of parameters and the high computational cost of PLMs pose challenges for their application in downstream tasks. This paper proposes a progressive distillation method based on masked generation features for the KGC task, aiming to significantly reduce the complexity of pre-trained models. Specifically, we perform pre-distillation on a PLM to obtain a high-quality teacher model, and compress the PLM network to obtain multi-grade student models. However, traditional feature distillation is limited by the single form in which information is represented in the teacher model. To address this, we propose masked generation of teacher-student features, which carry richer representational information. Furthermore, there is a significant gap in representation ability between teacher and student, so we design a progressive distillation method that distills the student model at each grade level, enabling efficient knowledge transfer from teacher to students. The experimental results demonstrate that the model in the pre-distillation stage surpasses existing state-of-the-art methods. In the progressive distillation stage, the model significantly reduces the number of parameters while maintaining a certain level of performance; in particular, the parameters of the lower-grade student model are reduced by 56.7% compared to the baseline.
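The masked-generation feature distillation and grade-by-grade transfer described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the mask ratio, the learnable mask token, the projection layer, the encoder interface (encoders that return token-level features and expose a `hidden_dim` attribute), and the training loop are all assumptions introduced for the example.

```python
# Minimal sketch (not the paper's code) of masked-generation feature distillation
# plus a progressive, multi-grade teacher->student loop. All names and
# hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn


class MaskedFeatureDistiller(nn.Module):
    """Student regenerates the (frozen) teacher's features at masked positions."""

    def __init__(self, student_dim: int, teacher_dim: int, mask_ratio: float = 0.4):
        super().__init__()
        self.mask_ratio = mask_ratio
        self.mask_token = nn.Parameter(torch.zeros(student_dim))  # learnable [MASK] feature
        self.proj = nn.Linear(student_dim, teacher_dim)           # align hidden sizes

    def forward(self, student_feats: torch.Tensor, teacher_feats: torch.Tensor) -> torch.Tensor:
        # student_feats: (B, T, d_s); teacher_feats: (B, T, d_t), teacher is frozen.
        B, T, _ = student_feats.shape
        mask = torch.rand(B, T, device=student_feats.device) < self.mask_ratio
        # Replace masked student features with the mask token, then "generate" them back.
        corrupted = torch.where(
            mask.unsqueeze(-1), self.mask_token.expand_as(student_feats), student_feats
        )
        generated = self.proj(corrupted)
        # Feature-matching loss only on masked positions.
        return ((generated - teacher_feats.detach()) ** 2)[mask].mean()


def progressive_distillation(grades, dataloader, steps_per_grade: int = 1000):
    """Distill grade by grade: each trained student becomes the next teacher."""
    teacher = grades[0]  # pre-distilled, high-quality teacher model
    for student in grades[1:]:
        distiller = MaskedFeatureDistiller(student.hidden_dim, teacher.hidden_dim)
        optim = torch.optim.AdamW(
            list(student.parameters()) + list(distiller.parameters()), lr=1e-4
        )
        teacher.eval()
        for _, batch in zip(range(steps_per_grade), dataloader):
            with torch.no_grad():
                t_feats = teacher(batch)      # (B, T, d_t)
            s_feats = student(batch)          # (B, T, d_s)
            loss = distiller(s_feats, t_feats)
            optim.zero_grad()
            loss.backward()
            optim.step()
        teacher = student  # the distilled student teaches the next, smaller grade
    return teacher
```

In this sketch the masked-generation objective forces the student to reconstruct the teacher's features at corrupted positions rather than merely copying them, and the progressive loop narrows the teacher-student capacity gap one grade at a time, which is the intuition the abstract describes.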

Cite

Text

Fan et al. "Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I8.28680

Markdown

[Fan et al. "Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/fan2024aaai-progressive/) doi:10.1609/AAAI.V38I8.28680

BibTeX

@inproceedings{fan2024aaai-progressive,
  title     = {{Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion}},
  author    = {Fan, Cunhang and Chen, Yujie and Xue, Jun and Kong, Yonghui and Tao, Jianhua and Lv, Zhao},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {8380--8388},
  doi       = {10.1609/AAAI.V38I8.28680},
  url       = {https://mlanthology.org/aaai/2024/fan2024aaai-progressive/}
}