Harmonizing Knowledge Transfer in Neural Network with Unified Distillation
Abstract
Knowledge distillation (KD), known for its ability to transfer knowledge from a cumbersome network (teacher) to a lightweight one (student) without altering the architecture, has been garnering increasing attention. Two primary categories emerge within KD methods: feature-based, focusing on intermediate layers' features, and logits-based, targeting the final layer's logits. This paper introduces a novel perspective by leveraging diverse knowledge sources within a unified KD framework. Specifically, we aggregate features from intermediate layers into a comprehensive representation, effectively gathering semantic information from different stages and scales. Subsequently, we predict the distribution parameters from this representation. These steps transform knowledge from the intermediate layers into corresponding distributional forms, thereby allowing for knowledge distillation through a unified distribution constraint at different stages of the network, ensuring the comprehensiveness and coherence of knowledge transfer. Extensive experiments validate the effectiveness of the proposed method.
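The abstract describes a three-step pipeline: aggregate intermediate-stage features into one representation, predict distribution parameters from it, and distill with a unified distribution constraint. Below is a minimal PyTorch sketch of that idea, assuming a diagonal-Gaussian parameterization, 1x1-conv projections, and average pooling; the names (`DistributionHead`, `gaussian_kl`, `latent_dim`) and these design choices are illustrative assumptions based only on the abstract, not the authors' implementation.

```python
# Illustrative sketch only -- not the paper's code. Assumes a diagonal
# Gaussian over a shared latent space; the paper's actual distribution
# family and aggregation scheme may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DistributionHead(nn.Module):
    """Aggregates features from several network stages into a single
    representation and predicts (mu, log_var) of a diagonal Gaussian."""
    def __init__(self, stage_channels, latent_dim=128):
        super().__init__()
        # 1x1 convs project each stage's features to a common width
        self.proj = nn.ModuleList(
            nn.Conv2d(c, latent_dim, kernel_size=1) for c in stage_channels
        )
        self.fc_mu = nn.Linear(latent_dim, latent_dim)
        self.fc_logvar = nn.Linear(latent_dim, latent_dim)

    def forward(self, feats):  # feats: list of (B, C_i, H_i, W_i) tensors
        # Pool each projected stage to a (B, latent_dim) vector
        pooled = [F.adaptive_avg_pool2d(p(f), 1).flatten(1)
                  for p, f in zip(self.proj, feats)]
        z = torch.stack(pooled, dim=0).mean(dim=0)  # aggregate stages
        return self.fc_mu(z), self.fc_logvar(z)

def gaussian_kl(mu_s, logvar_s, mu_t, logvar_t):
    """KL(student || teacher) between diagonal Gaussians, averaged over
    the batch -- one plausible form of a 'unified distribution constraint'."""
    var_s, var_t = logvar_s.exp(), logvar_t.exp()
    kl = 0.5 * (logvar_t - logvar_s
                + (var_s + (mu_s - mu_t) ** 2) / var_t - 1)
    return kl.sum(dim=1).mean()
```

In training, one such head would be attached to the teacher and student feature pyramids and the KL term added to the task loss; the loss weighting and whether the teacher head is frozen are details the abstract does not specify.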
Cite
Text
Huang et al. "Harmonizing Knowledge Transfer in Neural Network with Unified Distillation." Proceedings of the European Conference on Computer Vision (ECCV), 2024. doi:10.1007/978-3-031-73414-4_4
Markdown
[Huang et al. "Harmonizing Knowledge Transfer in Neural Network with Unified Distillation." Proceedings of the European Conference on Computer Vision (ECCV), 2024.](https://mlanthology.org/eccv/2024/huang2024eccv-harmonizing/) doi:10.1007/978-3-031-73414-4_4
BibTeX
@inproceedings{huang2024eccv-harmonizing,
title = {{Harmonizing Knowledge Transfer in Neural Network with Unified Distillation}},
author = {Huang, Yaomin and Fang, Faming and Yan, Zaoming and Shen, Chaomin and Zhang, Guixu},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2024},
doi = {10.1007/978-3-031-73414-4_4},
url = {https://mlanthology.org/eccv/2024/huang2024eccv-harmonizing/}
}