Enhanced Meta-Learning for Cross-Lingual Named Entity Recognition with Minimal Resources

Abstract

For languages with no annotated resources, transferring knowledge from rich-resource languages is an effective solution for named entity recognition (NER). While existing methods directly transfer the source-learned model to a target language, in this paper we propose to fine-tune the learned model with a few similar examples retrieved for each test case, which benefits prediction by leveraging the structural and semantic information conveyed in those similar examples. To this end, we present a meta-learning algorithm that finds a model parameter initialization able to adapt quickly to a given test case, and we propose to construct multiple pseudo-NER tasks for meta-training by computing sentence similarities. To further improve the model's generalization ability across different languages, we introduce a masking scheme and augment the loss function with an additional maximum term during meta-training. We conduct extensive experiments on cross-lingual named entity recognition with minimal resources over five target languages. The results show that our approach significantly outperforms existing state-of-the-art methods across the board.
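The sketch below (not the authors' code) illustrates the meta-training idea summarized in the abstract under simplifying assumptions: pseudo-NER tasks are formed by retrieving the most similar source-language sentences for an anchor sentence (cosine similarity over sentence embeddings), and a first-order MAML-style update searches for an initialization that adapts quickly on the small support set before being evaluated on the query sentence. The toy tagger, retrieval scheme, and hyper-parameters are illustrative stand-ins; the masking scheme and maximum-loss term from the paper are omitted.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, TAGS, DIM = 1000, 5, 32  # toy sizes; the paper uses a multilingual encoder

class ToyTagger(nn.Module):
    """Stand-in for the multilingual sequence tagger (embedding + linear tag head)."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, DIM)
        self.head = nn.Linear(DIM, TAGS)

    def forward(self, tokens):              # tokens: (batch, seq_len)
        return self.head(self.emb(tokens))  # logits: (batch, seq_len, TAGS)

def ner_loss(model, tokens, tags):
    logits = model(tokens)
    return F.cross_entropy(logits.view(-1, TAGS), tags.view(-1))

def build_pseudo_task(anchor, sent_embs, k=4):
    """Pseudo-NER task: the anchor sentence acts as the query; its k nearest
    neighbours by cosine similarity form the support set (illustrative retrieval)."""
    sims = F.cosine_similarity(sent_embs[anchor].unsqueeze(0), sent_embs, dim=-1)
    sims[anchor] = float("-inf")             # exclude the anchor itself
    return sims.topk(k).indices, anchor

def meta_train_step(model, meta_opt, data, support_ids, query_id,
                    inner_lr=1e-2, inner_steps=2):
    """First-order MAML-style step: adapt a copy of the model on the support set,
    then push the adapted model's query-loss gradient back onto the shared init."""
    tokens, tags = data
    adapted = copy.deepcopy(model)
    inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    for _ in range(inner_steps):             # inner loop: fast adaptation
        loss = ner_loss(adapted, tokens[support_ids], tags[support_ids])
        inner_opt.zero_grad(); loss.backward(); inner_opt.step()
    q_loss = ner_loss(adapted, tokens[query_id:query_id + 1],  # outer (query) loss
                      tags[query_id:query_id + 1])
    grads = torch.autograd.grad(q_loss, adapted.parameters())
    meta_opt.zero_grad()
    for p, g in zip(model.parameters(), grads):  # first-order approximation
        p.grad = g
    meta_opt.step()
    return q_loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    n, seq = 64, 12
    tokens = torch.randint(0, VOCAB, (n, seq))   # fake annotated source-language corpus
    tags = torch.randint(0, TAGS, (n, seq))
    model = ToyTagger()
    sent_embs = model.emb(tokens).mean(dim=1).detach()  # crude sentence embeddings
    meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for anchor in range(8):                      # a few pseudo-tasks
        support_ids, query_id = build_pseudo_task(anchor, sent_embs)
        print(meta_train_step(model, meta_opt, (tokens, tags), support_ids, query_id))

At test time the same recipe applies: retrieve a few source sentences similar to the target-language test sentence, take a handful of gradient steps from the meta-learned initialization, and predict.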

Cite

Text

Wu et al. "Enhanced Meta-Learning for Cross-Lingual Named Entity Recognition with Minimal Resources." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I05.6466

Markdown

[Wu et al. "Enhanced Meta-Learning for Cross-Lingual Named Entity Recognition with Minimal Resources." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/wu2020aaai-enhanced/) doi:10.1609/AAAI.V34I05.6466

BibTeX

@inproceedings{wu2020aaai-enhanced,
  title     = {{Enhanced Meta-Learning for Cross-Lingual Named Entity Recognition with Minimal Resources}},
  author    = {Wu, Qianhui and Lin, Zijia and Wang, Guoxin and Chen, Hui and Karlsson, Börje F. and Huang, Biqing and Lin, Chin-Yew},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {9274--9281},
  doi       = {10.1609/AAAI.V34I05.6466},
  url       = {https://mlanthology.org/aaai/2020/wu2020aaai-enhanced/}
}