UM4: Unified Multilingual Multiple Teacher-Student Model for Zero-Resource Neural Machine Translation
Abstract
Most translation tasks among languages belong to the zero-resource translation problem, where parallel corpora are unavailable. Multilingual neural machine translation (MNMT) enables one-pass translation using a shared semantic space for all languages, in contrast to two-pass pivot translation, but it often underperforms the pivot-based method. In this paper, we propose a novel method, named the Unified Multilingual Multiple teacher-student Model for NMT (UM4). Our method unifies source-teacher, target-teacher, and pivot-teacher models to guide the student model for zero-resource translation. The source teacher and target teacher force the student to learn the direct source-target translation through distilled knowledge on both the source and target sides. The monolingual corpus is further leveraged by the pivot-teacher model to enhance the student model. Experimental results demonstrate that our model significantly outperforms previous methods on 72 translation directions of the WMT benchmark.
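To make the multi-teacher guidance described above more concrete, the sketch below shows one plausible way to combine word-level distillation signals from several teachers with the usual cross-entropy loss on gold targets. It is a minimal, hypothetical PyTorch illustration; the function name, the interpolation weights, and the exact combination are assumptions for exposition, not the paper's UM4 training objective.

```python
import torch
import torch.nn.functional as F

def multi_teacher_distillation_loss(student_logits, teacher_logits_list,
                                    teacher_weights, gold_ids, ce_weight=1.0):
    """Illustrative multi-teacher word-level distillation loss (not the UM4 objective).

    student_logits:      (batch, seq_len, vocab) logits from the student model
    teacher_logits_list: list of (batch, seq_len, vocab) logits, e.g. from
                         source-, target-, and pivot-teacher models
    teacher_weights:     list of floats, one interpolation weight per teacher
    gold_ids:            (batch, seq_len) gold target token ids
    """
    log_p_student = F.log_softmax(student_logits, dim=-1)

    # KL(teacher || student), one term per teacher, weighted and summed.
    kd_loss = 0.0
    for w, t_logits in zip(teacher_weights, teacher_logits_list):
        p_teacher = F.softmax(t_logits, dim=-1)
        kd_loss = kd_loss + w * F.kl_div(log_p_student, p_teacher,
                                         reduction="batchmean")

    # Standard cross-entropy on the gold reference tokens.
    ce_loss = F.cross_entropy(student_logits.view(-1, student_logits.size(-1)),
                              gold_ids.view(-1))
    return ce_weight * ce_loss + kd_loss
```

In practice, the student would be trained on synthetic or pivot-bridged data for the zero-resource pair, with the teacher terms pulling its token distributions toward those of the source-, target-, and pivot-side teachers.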
Cite
Text
Yang et al. "UM4: Unified Multilingual Multiple Teacher-Student Model for Zero-Resource Neural Machine Translation." International Joint Conference on Artificial Intelligence, 2022. doi:10.24963/IJCAI.2022/618
Markdown
[Yang et al. "UM4: Unified Multilingual Multiple Teacher-Student Model for Zero-Resource Neural Machine Translation." International Joint Conference on Artificial Intelligence, 2022.](https://mlanthology.org/ijcai/2022/yang2022ijcai-um/) doi:10.24963/IJCAI.2022/618
BibTeX
@inproceedings{yang2022ijcai-um,
title = {{UM4: Unified Multilingual Multiple Teacher-Student Model for Zero-Resource Neural Machine Translation}},
author = {Yang, Jian and Yin, Yuwei and Ma, Shuming and Zhang, Dongdong and Wu, Shuangzhi and Guo, Hongcheng and Li, Zhoujun and Wei, Furu},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2022},
pages = {4454--4460},
doi = {10.24963/IJCAI.2022/618},
url = {https://mlanthology.org/ijcai/2022/yang2022ijcai-um/}
}