How to Train the Teacher Model for Effective Knowledge Distillation

Abstract

Recently, it was shown that the role of the teacher in knowledge distillation (KD) is to provide the student with an estimate of the true Bayes conditional probability density (BCPD). Notably, these findings show that the student's error rate can be upper-bounded by the mean squared error (MSE) between the teacher's output and the BCPD. Consequently, to enhance KD efficacy, the teacher should be trained such that its output is close to the BCPD in the MSE sense. This paper elucidates that training the teacher model with the MSE loss is equivalent to minimizing the MSE between its output and the BCPD, which aligns with the teacher's core responsibility of providing the student with a BCPD estimate that is close in the MSE sense. Through a comprehensive set of experiments, we demonstrate that replacing the conventional teacher trained with cross-entropy loss by one trained with the MSE loss in state-of-the-art KD methods consistently boosts the student's accuracy, with improvements of up to 2.6%. The code for this paper is publicly available at: https://github.com/ECCV2024MSE/ECCV_MSE_Teacher.
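
The recipe described in the abstract is to train the teacher by minimizing the MSE between its softmax output and the one-hot label vector, rather than the usual cross-entropy, and then to distill as usual. Below is a minimal PyTorch-style sketch of that teacher-training step, assuming a standard classification setup; the function names, optimizer settings, and training loop are illustrative and are not taken from the authors' released code.

# Minimal sketch (not the authors' code): train the teacher with MSE loss
# between its softmax output and the one-hot labels, instead of cross-entropy.
import torch
import torch.nn as nn
import torch.nn.functional as F

def mse_teacher_loss(logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    """MSE between the teacher's softmax output (its BCPD estimate) and the one-hot labels."""
    probs = F.softmax(logits, dim=1)
    one_hot = F.one_hot(targets, num_classes=logits.size(1)).float()
    return F.mse_loss(probs, one_hot)

def train_teacher(teacher: nn.Module, loader, epochs: int = 1, lr: float = 0.1) -> nn.Module:
    """Hypothetical training loop: identical to standard supervised training,
    except the cross-entropy objective is replaced by the MSE objective above."""
    opt = torch.optim.SGD(teacher.parameters(), lr=lr, momentum=0.9)
    teacher.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = mse_teacher_loss(teacher(x), y)  # MSE loss instead of cross-entropy
            loss.backward()
            opt.step()
    return teacher

The resulting teacher checkpoint is then used unchanged in whichever KD method is being applied; per the abstract, only the teacher's training objective changes.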

Cite

Text

Hamidi et al. "How to Train the Teacher Model for Effective Knowledge Distillation." Proceedings of the European Conference on Computer Vision (ECCV), 2024. doi:10.1007/978-3-031-73024-5_1

Markdown

[Hamidi et al. "How to Train the Teacher Model for Effective Knowledge Distillation." Proceedings of the European Conference on Computer Vision (ECCV), 2024.](https://mlanthology.org/eccv/2024/hamidi2024eccv-train/) doi:10.1007/978-3-031-73024-5_1

BibTeX

@inproceedings{hamidi2024eccv-train,
  title     = {{How to Train the Teacher Model for Effective Knowledge Distillation}},
  author    = {Hamidi, Shayan Mohajer and Deng, Xizhen and Tan, Renhao and Ye, Linfeng and Salamah, Ahmed Hussein},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2024},
  doi       = {10.1007/978-3-031-73024-5_1},
  url       = {https://mlanthology.org/eccv/2024/hamidi2024eccv-train/}
}