Discrepancy and Uncertainty Aware Denoising Knowledge Distillation for Zero-Shot Cross-Lingual Named Entity Recognition

Abstract

Knowledge distillation-based approaches have recently yielded state-of-the-art (SOTA) results for cross-lingual NER tasks in zero-shot scenarios. These approaches typically employ a teacher network trained on the labelled source (rich-resource) language to infer pseudo-soft labels for the unlabelled target (zero-shot) language, and force a student network to approximate these pseudo labels to achieve knowledge transfer. However, previous works have rarely discussed the issue of pseudo-label noise caused by the source-target language gap, which can mislead the training of the student network and result in negative knowledge transfer. This paper proposes a discrepancy and uncertainty aware Denoising Knowledge Distillation model (DenKD) to tackle this issue. Specifically, DenKD uses a discrepancy-aware denoising representation learning method to optimize the class representations of the target language produced by the teacher network, thus enhancing the quality of the pseudo labels and reducing noisy predictions. Further, DenKD employs an uncertainty-aware denoising method to quantify the pseudo-label noise and adjust the focus of the student network on different samples during knowledge distillation, thereby mitigating the adverse effects of the noise. We conduct extensive experiments on 28 languages, including 4 languages not covered by the pre-trained models, and the results demonstrate the effectiveness of our DenKD.
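To make the uncertainty-aware idea concrete, the sketch below shows one common way such a scheme can be realized: weighting each token's distillation loss by the teacher's predictive confidence, so that noisy (high-entropy) pseudo labels contribute less to the student's training. This is a minimal illustration of the general technique, not the paper's exact formulation; the entropy-based weighting and all function names are assumptions.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def uncertainty_weighted_kd_loss(teacher_logits, student_logits):
    """Illustrative distillation loss: each token's KL term is down-weighted
    by the teacher's predictive uncertainty (normalized entropy of its soft
    pseudo label), so noisy pseudo labels influence the student less.
    NOTE: a generic sketch, not DenKD's actual objective."""
    p = softmax(teacher_logits)  # teacher soft pseudo-labels, shape (n, k)
    q = softmax(student_logits)  # student predictions, shape (n, k)
    k = p.shape[-1]
    entropy = -(p * np.log(p + 1e-12)).sum(axis=-1)
    weight = 1.0 - entropy / np.log(k)  # 1 = confident, 0 = maximally uncertain
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float((weight * kl).mean())

# A confident teacher token drives the loss; a uniform (maximally uncertain)
# teacher token receives weight 0 and is effectively ignored.
teacher = np.array([[5.0, 0.0, 0.0],   # confident pseudo label
                    [0.0, 0.0, 0.0]])  # uniform, i.e. pure noise
student = np.zeros((2, 3))
loss = uncertainty_weighted_kd_loss(teacher, student)
```

In a full pipeline this per-token weight would multiply the soft cross-entropy between teacher and student over the unlabelled target-language text, steering the student away from unreliable pseudo labels.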

Cite

Text

Ge et al. "Discrepancy and Uncertainty Aware Denoising Knowledge Distillation for Zero-Shot Cross-Lingual Named Entity Recognition." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I16.29762

Markdown

[Ge et al. "Discrepancy and Uncertainty Aware Denoising Knowledge Distillation for Zero-Shot Cross-Lingual Named Entity Recognition." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/ge2024aaai-discrepancy/) doi:10.1609/AAAI.V38I16.29762

BibTeX

@inproceedings{ge2024aaai-discrepancy,
  title     = {{Discrepancy and Uncertainty Aware Denoising Knowledge Distillation for Zero-Shot Cross-Lingual Named Entity Recognition}},
  author    = {Ge, Ling and Hu, Chunming and Ma, Guanghui and Liu, Jihong and Zhang, Hong},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {18056--18064},
  doi       = {10.1609/AAAI.V38I16.29762},
  url       = {https://mlanthology.org/aaai/2024/ge2024aaai-discrepancy/}
}