Knowledge Refinery: Learning from Decoupled Label

Abstract

Recently, a variety of regularization techniques have been widely applied to deep neural networks; most of them focus on regularizing the weight parameters to improve generalization. Label regularization techniques have also been proposed with the motivation of softening the labels, but they neglect the relations among classes. Among related techniques, knowledge distillation distills a soft label that encodes the knowledge of class relations; however, it requires pre-training an extra, cumbersome teacher model. In this paper, we propose a method called Knowledge Refinery (KR), which enables a neural network to learn the relations among classes on the fly without the teacher-student training strategy. We introduce the definition of decoupled labels, which consist of the original hard label and a residual label. To demonstrate the generality of KR, we evaluate our method in both computer vision and natural language processing. Our empirical results show consistent performance gains under all experimental settings.
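
To make the decoupled-label idea concrete, below is a minimal PyTorch sketch, not the authors' released code. It assumes the residual label is maintained per class as a running average of the model's own softened predictions over the non-target classes, and that training combines hard-label cross-entropy with a KL term toward that residual; the class name ResidualLabelBank and all hyperparameters are hypothetical illustrations of one plausible reading of the abstract.

import torch
import torch.nn.functional as F

class ResidualLabelBank:
    """Keeps one running 'residual label' per class: an EMA of the model's
    softened predictions over the non-target classes (hypothetical helper)."""
    def __init__(self, num_classes, momentum=0.9, temperature=2.0):
        self.num_classes = num_classes
        self.momentum = momentum
        self.temperature = temperature
        # Start from a uniform distribution over the non-target classes.
        init = torch.full((num_classes, num_classes), 1.0 / (num_classes - 1))
        init.fill_diagonal_(0.0)
        self.bank = init

    @torch.no_grad()
    def update(self, logits, target):
        # Softened predictions with the target class removed and renormalized.
        probs = F.softmax(logits / self.temperature, dim=1)
        probs.scatter_(1, target.unsqueeze(1), 0.0)
        probs = probs / probs.sum(dim=1, keepdim=True).clamp_min(1e-12)
        # EMA update of the stored residual label for each class in the batch.
        for c in target.unique():
            class_mean = probs[target == c].mean(dim=0)
            self.bank[c] = self.momentum * self.bank[c] + (1 - self.momentum) * class_mean

    def loss(self, logits, target, alpha=0.5):
        """Hard-label cross-entropy plus KL toward the stored residual label."""
        hard_loss = F.cross_entropy(logits, target)
        mask = F.one_hot(target, self.num_classes).bool()
        # Predicted distribution restricted to the non-target classes.
        log_pred = F.log_softmax(logits.masked_fill(mask, float('-inf')) / self.temperature, dim=1)
        residual = self.bank[target]  # (batch, num_classes)
        soft_loss = F.kl_div(log_pred, residual, reduction='batchmean')
        return hard_loss + alpha * soft_loss

# Usage inside a training step (shapes only; model and optimizer omitted):
bank = ResidualLabelBank(num_classes=10)
logits = torch.randn(4, 10, requires_grad=True)
target = torch.tensor([1, 3, 5, 7])
loss = bank.loss(logits, target)
loss.backward()
bank.update(logits.detach(), target)

Under these assumptions, no teacher network is needed: the residual labels are refined from the model's own predictions during training, which matches the abstract's claim of learning class relations on the fly.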

Cite

Text

Ding et al. "Knowledge Refinery: Learning from Decoupled Label." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I8.16888

Markdown

[Ding et al. "Knowledge Refinery: Learning from Decoupled Label." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/ding2021aaai-knowledge/) doi:10.1609/AAAI.V35I8.16888

BibTeX

@inproceedings{ding2021aaai-knowledge,
  title     = {{Knowledge Refinery: Learning from Decoupled Label}},
  author    = {Ding, Qianggang and Wu, Sifan and Dai, Tao and Sun, Hao and Guo, Jiadong and Fu, Zhang-Hua and Xia, Shutao},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {7228--7235},
  doi       = {10.1609/AAAI.V35I8.16888},
  url       = {https://mlanthology.org/aaai/2021/ding2021aaai-knowledge/}
}