Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation

Abstract

Non-Autoregressive Neural Machine Translation (NAT) achieves significant decoding speedup by generating target words independently and simultaneously. However, in the context of non-autoregressive translation, the word-level cross-entropy loss cannot properly model the target-side sequential dependency, so it correlates weakly with translation quality. As a result, NAT tends to generate disfluent translations with over-translation and under-translation errors. In this paper, we propose to train NAT to minimize the Bag-of-Ngrams (BoN) difference between the model output and the reference sentence. The BoN training objective is differentiable and can be calculated efficiently; it encourages NAT to capture the target-side sequential dependency and correlates well with translation quality. We validate our approach on three translation tasks and show that it outperforms the NAT baseline by about 5.0 BLEU on WMT14 En↔De and about 2.5 BLEU on WMT16 En↔Ro.
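To make the idea concrete, below is a minimal, illustrative PyTorch sketch of a differentiable BoN loss for bigrams, not the authors' implementation. It assumes the NAT model exposes a per-position distribution over the vocabulary; because positions are predicted independently, the expected count of each bigram factorizes into a product of adjacent position probabilities. The function names are hypothetical, and the dense V×V bigram table is for exposition only; the paper derives a far more efficient computation.

```python
import torch

def expected_bigram_counts(probs: torch.Tensor) -> torch.Tensor:
    """Expected bag of bigrams under a NAT model's independent per-position
    output distributions. probs: (T, V) rows of token probabilities."""
    # Positions are independent, so the expected count of bigram (a, b) is
    # sum_t probs[t, a] * probs[t + 1, b]; this stays differentiable in probs.
    return torch.einsum('tv,tw->vw', probs[:-1], probs[1:])

def reference_bigram_counts(ref: torch.Tensor, vocab_size: int) -> torch.Tensor:
    """Bag of bigrams of the reference sentence ref: (T_ref,) token ids."""
    counts = torch.zeros(vocab_size, vocab_size)
    for a, b in zip(ref[:-1].tolist(), ref[1:].tolist()):
        counts[a, b] += 1.0
    return counts

def bon_l1_loss(probs: torch.Tensor, ref: torch.Tensor) -> torch.Tensor:
    """L1 distance between the predicted and reference bags of bigrams."""
    vocab_size = probs.shape[1]
    pred = expected_bigram_counts(probs)
    gold = reference_bigram_counts(ref, vocab_size)
    return (pred - gold).abs().sum()

# Toy usage: 4 predicted positions over a vocabulary of size 6.
logits = torch.randn(4, 6, requires_grad=True)
probs = torch.softmax(logits, dim=-1)
loss = bon_l1_loss(probs, torch.tensor([1, 2, 3, 4]))
loss.backward()  # gradients reach the logits, so the loss can train a NAT model
```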

Cite

Text

Shao et al. "Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/aaai.v34i01.5351

Markdown

[Shao et al. "Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/shao2020aaai-minimizing/) doi:10.1609/aaai.v34i01.5351

BibTeX

@inproceedings{shao2020aaai-minimizing,
  title     = {{Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation}},
  author    = {Shao, Chenze and Zhang, Jinchao and Feng, Yang and Meng, Fandong and Zhou, Jie},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {198--205},
  doi       = {10.1609/aaai.v34i01.5351},
  url       = {https://mlanthology.org/aaai/2020/shao2020aaai-minimizing/}
}