SHARP-Distill: A 68$\times$ Faster Recommender System with Hypergraph Neural Networks and Language Models
Abstract
This paper proposes SHARP-Distill (Speedy Hypergraph And Review-based Personalised Distillation), a novel knowledge distillation approach based on the teacher-student framework that combines Hypergraph Neural Networks (HGNNs) with language models to enhance recommendation quality while significantly reducing inference time. The teacher model leverages HGNNs to generate user and item embeddings from interaction data, capturing high-order and group relationships, and employs a pre-trained language model to extract rich semantic features from textual reviews. A contrastive learning mechanism ensures structural consistency between these representations. The student is a shallow, lightweight GCN, called CompactGCN, designed to inherit the teacher's high-order relational knowledge while reducing computational complexity. Extensive experiments on real-world datasets demonstrate that SHARP-Distill achieves up to 68$\times$ faster inference than HGNN and 40$\times$ faster than LightGCN while maintaining competitive recommendation accuracy.
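To make the teacher-student idea concrete, here is a minimal illustrative sketch (not the authors' code): fixed "teacher" embeddings, standing in for the HGNN-plus-language-model output, are distilled into a student whose embeddings are propagated by a single LightGCN-style graph convolution and trained with a plain MSE distillation loss. All names, sizes, and the loss choice are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, dim = 4, 6, 8
n = n_users + n_items

# Hypothetical teacher embeddings; in the paper these would come from
# the HGNN plus language-model review features.
teacher = rng.normal(size=(n, dim))

# Random bipartite user-item interactions and the symmetric normalised
# adjacency D^{-1/2} A D^{-1/2} used by LightGCN-style propagation.
R = (rng.random((n_users, n_items)) > 0.5).astype(float)
A = np.zeros((n, n))
A[:n_users, n_users:] = R
A[n_users:, :n_users] = R.T
deg = A.sum(axis=1)
d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
A_hat = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

# Student: trainable base embeddings E; one propagation step A_hat @ E.
E = rng.normal(size=(n, dim)) * 0.1
mse0 = float(np.mean((A_hat @ E - teacher) ** 2))  # loss before training

# Gradient descent on the distillation loss ||A_hat E - teacher||_F^2
# (A_hat is symmetric, so the gradient is 2 A_hat (A_hat E - teacher)).
lr = 0.1
for _ in range(500):
    grad = 2.0 * A_hat @ (A_hat @ E - teacher)
    E -= lr * grad

mse = float(np.mean((A_hat @ E - teacher) ** 2))
```

After training, `mse` is well below `mse0`: the one-layer student has absorbed as much of the teacher signal as its single propagation step can represent, which is the trade-off that buys the inference speed-up.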
Cite
Text
Forouzandeh et al. "SHARP-Distill: A 68$\times$ Faster Recommender System with Hypergraph Neural Networks and Language Models." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Forouzandeh et al. "SHARP-Distill: A 68$\times$ Faster Recommender System with Hypergraph Neural Networks and Language Models." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/forouzandeh2025icml-sharpdistill/)
BibTeX
@inproceedings{forouzandeh2025icml-sharpdistill,
title = {{SHARP-Distill: A 68$\times$ Faster Recommender System with Hypergraph Neural Networks and Language Models}},
author = {Forouzandeh, Saman and Moradi, Parham and Jalili, Mahdi},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {17452--17488},
volume = {267},
url = {https://mlanthology.org/icml/2025/forouzandeh2025icml-sharpdistill/}
}