Active Large Language Model-Based Knowledge Distillation for Session-Based Recommendation
Abstract
Large language models (LLMs) offer a promising route to accurate session-based recommendation (SBR), but they demand substantial computation time and memory. Knowledge distillation (KD) can alleviate these issues by training a small student model on the predictions of a cumbersome teacher. However, existing KD methods face two difficulties when the teacher is an LLM in SBR: 1) it is expensive to have the LLM predict for every instance during KD; and 2) the LLM may make ineffective predictions for some instances, e.g., incorrect predictions on hard instances or predictions similar to those of existing recommenders on easy instances. In this paper, we propose an active LLM-based KD method for SBR, contributing to sustainable AI. To distill knowledge from LLMs efficiently at limited cost, we query the LLM on only a small proportion of instances. For a more effective distillation, we further propose a theoretically grounded active learning strategy to select instances that are as useful as possible for KD. Specifically, we first formulate the gain of each instance for KD based on its potential effect (e.g., an effective, similar, or incorrect LLM prediction) and its difficulty (e.g., easy or hard to fit). We then maximize the minimal distillation gain to find the optimal selection policy for active learning, which largely avoids querying ineffective instances. Experiments on real-world datasets show that our method significantly outperforms state-of-the-art SBR methods.
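The max-min selection idea in the abstract can be illustrated with a minimal sketch. This is a hypothetical toy, not the paper's actual formulation: instance ids, gain values, and the greedy budgeted selection are all illustrative assumptions. Each candidate instance is given an estimated gain under each possible LLM outcome (effective, similar to the existing recommender, or incorrect), and we pick the instances whose worst-case gain is largest, within a fixed query budget.

```python
def select_instances(candidate_gains, budget):
    """Pick the `budget` instances with the largest worst-case (minimal) gain.

    candidate_gains: dict mapping instance id -> list of gains, one per
    possible LLM outcome (e.g., effective / similar / incorrect prediction).
    """
    # Worst-case gain per instance: the minimum over possible outcomes.
    worst_case = {i: min(gains) for i, gains in candidate_gains.items()}
    # Rank instances by worst-case gain, best first, and keep the top `budget`.
    ranked = sorted(worst_case, key=worst_case.get, reverse=True)
    return ranked[:budget]

# Toy example: four candidate sessions, three hypothetical outcomes each.
gains = {
    "s1": [0.9, 0.2, -0.5],  # high upside but risky: the LLM may be wrong
    "s2": [0.4, 0.3, 0.2],   # modest but safe gains
    "s3": [0.1, 0.1, 0.0],   # LLM likely similar to the student: little to learn
    "s4": [0.6, 0.5, 0.3],   # solid gain under every outcome
}
print(select_instances(gains, budget=2))  # → ['s4', 's2']
```

Under this max-min criterion, the risky instance `s1` is skipped despite its high best-case gain, matching the abstract's goal of avoiding instances where the LLM's prediction may be ineffective.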
Cite
Text
Du et al. "Active Large Language Model-Based Knowledge Distillation for Session-Based Recommendation." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I11.33263

Markdown
[Du et al. "Active Large Language Model-Based Knowledge Distillation for Session-Based Recommendation." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/du2025aaai-active/) doi:10.1609/AAAI.V39I11.33263

BibTeX
@inproceedings{du2025aaai-active,
title = {{Active Large Language Model-Based Knowledge Distillation for Session-Based Recommendation}},
author = {Du, Yingpeng and Sun, Zhu and Wang, Ziyan and Chua, Haoyan and Zhang, Jie and Ong, Yew-Soon},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
pages = {11607--11615},
doi = {10.1609/AAAI.V39I11.33263},
url = {https://mlanthology.org/aaai/2025/du2025aaai-active/}
}