Meta Back-Translation

Abstract

Back-translation is an effective strategy to improve the performance of Neural Machine Translation (NMT) by generating pseudo-parallel data. However, several recent works have found that better translation quality in the pseudo-parallel data does not necessarily lead to a better final translation model, while lower-quality but more diverse data often yields stronger results. In this paper, we propose a new way to generate pseudo-parallel data for back-translation that directly optimizes the final model performance. Specifically, we propose a meta-learning framework in which the back-translation model learns to match the forward-translation model's gradients on the development data with those on the pseudo-parallel data. In our evaluations on both the standard WMT En-De'14 and WMT En-Fr'14 datasets and in a multilingual translation setting, our method leads to significant improvements over strong baselines.
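
The mechanism sketched in the abstract, updating the back-translator so that the forward model's gradients on the generated pseudo-parallel data line up with its gradients on the development data, can be illustrated with a toy example. The sketch below is not the authors' implementation: it assumes linear models, a squared loss, and a dot-product notion of gradient "matching", and it derives the meta-update for the back-translator's parameters `phi` in closed form rather than by differentiating through a training step of a real NMT system.

```python
# Hypothetical toy sketch of gradient-matching back-translation (NumPy, linear models).
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# "Forward-translation model": a linear regressor with weights w.
w = rng.normal(size=dim)
# "Back-translation model": produces pseudo-targets y = x @ phi for monolingual sources x.
phi = rng.normal(size=dim)

# Small development set and a pool of monolingual source vectors.
true_w = rng.normal(size=dim)
x_dev = rng.normal(size=(16, dim))
y_dev = x_dev @ true_w
x_mono = rng.normal(size=(64, dim))

def forward_grad(w, x, y):
    """Gradient of the forward model's loss 0.5 * mean((x @ w - y)^2) w.r.t. w."""
    return x.T @ (x @ w - y) / len(y)

lr_w, lr_phi = 0.02, 0.02
print("dev MSE before:", float(np.mean((x_dev @ w - y_dev) ** 2)))

for step in range(500):
    # 1) Back-translate: generate pseudo-targets for a batch of monolingual sources.
    batch = x_mono[rng.choice(len(x_mono), size=8, replace=False)]

    # 2) Forward-model gradient on the development data.
    g_dev = forward_grad(w, x_dev, y_dev)

    # 3) Meta-step: update phi to increase the alignment <g_dev, g_pseudo>, where
    #    g_pseudo = forward_grad(w, batch, batch @ phi). For this squared loss,
    #    d<g_dev, g_pseudo>/dphi = -batch.T @ (batch @ g_dev) / n.
    phi += lr_phi * (-batch.T @ (batch @ g_dev) / len(batch))

    # 4) Ordinary step: train the forward model on the refreshed pseudo-parallel batch.
    w -= lr_w * forward_grad(w, batch, batch @ phi)

print("dev MSE after: ", float(np.mean((x_dev @ w - y_dev) ** 2)))
```

Here `true_w` plays the role of the ideal translations the dev set encodes; the alignment objective pulls `phi` toward pseudo-targets whose training gradients help the forward model on the dev set, which is the intuition behind the meta-learning objective described above.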

Cite

Text

Pham et al. "Meta Back-Translation." International Conference on Learning Representations, 2021.

Markdown

[Pham et al. "Meta Back-Translation." International Conference on Learning Representations, 2021.](https://mlanthology.org/iclr/2021/pham2021iclr-meta/)

BibTeX

@inproceedings{pham2021iclr-meta,
  title     = {{Meta Back-Translation}},
  author    = {Pham, Hieu and Wang, Xinyi and Yang, Yiming and Neubig, Graham},
  booktitle = {International Conference on Learning Representations},
  year      = {2021},
  url       = {https://mlanthology.org/iclr/2021/pham2021iclr-meta/}
}