Complementary Relation Contrastive Distillation

Abstract

Knowledge distillation aims to transfer representation ability from a teacher model to a student model. Previous approaches focus on either individual representation distillation or inter-sample similarity preservation. We argue that the inter-sample relation conveys abundant information and needs to be distilled in a more effective way. In this paper, we propose a novel knowledge distillation method, namely Complementary Relation Contrastive Distillation (CRCD), to transfer the structural knowledge from the teacher to the student. Specifically, we estimate the mutual relation in an anchor-based way and distill the anchor-student relation under the supervision of its corresponding anchor-teacher relation. To make it more robust, mutual relations are modeled by two complementary elements: the feature and its gradient. Furthermore, the lower bound of mutual information between the anchor-teacher relation distribution and the anchor-student relation distribution is maximized via a relation contrastive loss, which distills both the sample representation and the inter-sample relations. Experiments on different benchmarks demonstrate the effectiveness of our proposed CRCD.
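To make the anchor-based, contrastive flavor of the loss concrete, here is a minimal stdlib-only sketch (not the authors' implementation). Assumptions: relations are modeled as feature differences to a single anchor, similarity is cosine, and the InfoNCE-style objective treats the matching anchor-teacher/anchor-student relation pair as the positive and all other pairs in the batch as negatives. The temperature `tau`, the difference-based relation, and the function names are illustrative; the paper's full method also uses gradient-based relations and learned embedding heads, omitted here.

```python
import math


def cosine(u, v):
    # Cosine similarity between two plain-Python vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)


def relation_contrastive_loss(anchor_t, anchor_s, feats_t, feats_s, tau=0.5):
    """Toy anchor-based relation contrastive loss (illustrative, not CRCD's exact form).

    For each sample i, the anchor-teacher relation (anchor_t -> feats_t[i]) is
    the query; its positive is the corresponding anchor-student relation
    (anchor_s -> feats_s[i]); anchor-student relations to the other samples
    serve as negatives, giving an InfoNCE-style cross-entropy.
    """
    loss = 0.0
    n = len(feats_t)
    for i in range(n):
        # Anchor-teacher relation for sample i, modeled as a feature difference.
        r_t = [f - a for f, a in zip(feats_t[i], anchor_t)]
        logits = []
        for j in range(n):
            # Anchor-student relations: positive at j == i, negatives elsewhere.
            r_s = [f - a for f, a in zip(feats_s[j], anchor_s)]
            logits.append(cosine(r_t, r_s) / tau)
        # Numerically stable log-sum-exp for the softmax denominator.
        m = max(logits)
        log_den = m + math.log(sum(math.exp(l - m) for l in logits))
        loss += -(logits[i] - log_den)
    return loss / n
```

In this toy setup, a student whose relations to the anchor mirror the teacher's achieves a lower loss than one whose relations are permuted, which is the behavior the contrastive objective is meant to induce.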

Cite

Text

Zhu et al. "Complementary Relation Contrastive Distillation." Conference on Computer Vision and Pattern Recognition, 2021. doi:10.1109/CVPR46437.2021.00914

Markdown

[Zhu et al. "Complementary Relation Contrastive Distillation." Conference on Computer Vision and Pattern Recognition, 2021.](https://mlanthology.org/cvpr/2021/zhu2021cvpr-complementary/) doi:10.1109/CVPR46437.2021.00914

BibTeX

@inproceedings{zhu2021cvpr-complementary,
  title     = {{Complementary Relation Contrastive Distillation}},
  author    = {Zhu, Jinguo and Tang, Shixiang and Chen, Dapeng and Yu, Shijie and Liu, Yakun and Rong, Mingzhe and Yang, Aijun and Wang, Xiaohua},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2021},
  pages     = {9260--9269},
  doi       = {10.1109/CVPR46437.2021.00914},
  url       = {https://mlanthology.org/cvpr/2021/zhu2021cvpr-complementary/}
}