Maximum Expected Likelihood Estimation for Zero-Resource Neural Machine Translation

Abstract

While neural machine translation (NMT) has recently made remarkable progress in translating a handful of high-resource language pairs, parallel corpora are unavailable for many zero-resource language pairs. To deal with this problem, we propose an approach to zero-resource NMT via maximum expected likelihood estimation. The basic idea is to maximize, on a pivot-target parallel corpus, the expectation of the intended source-to-target model's likelihood with respect to a pivot-to-source translation model. To approximate the expectation, we propose two methods to connect the pivot-to-source and source-to-target models. Experiments on two zero-resource language pairs show that the proposed approach yields substantial gains over baseline methods. We also observe that when trained jointly with the source-to-target model, the pivot-to-source translation model also obtains improvements over independent training.
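The training objective described in the abstract can be sketched in LaTeX as follows. The notation here is introduced for illustration and is not taken from the paper itself: \(z\) denotes a pivot-language sentence, \(x\) a source sentence, \(y\) a target sentence, \(D_{z,y}\) the pivot-target parallel corpus, and \(\theta_{z \rightarrow x}\), \(\theta_{x \rightarrow y}\) the parameters of the pivot-to-source and source-to-target models.

```latex
% Maximum expected likelihood objective (illustrative notation):
% maximize the expected log-likelihood of the source-to-target model,
% where the latent source sentence x is drawn from the
% pivot-to-source translation model.
J(\theta_{x \rightarrow y}) =
  \sum_{(z, y) \in D_{z,y}}
  \mathbb{E}_{x \sim P(x \mid z;\, \theta_{z \rightarrow x})}
  \bigl[ \log P(y \mid x;\, \theta_{x \rightarrow y}) \bigr]
```

Because the expectation over all possible source sentences \(x\) is intractable, it must be approximated in practice; the paper proposes two methods for connecting the two models to this end.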

Cite

Text

Zheng et al. "Maximum Expected Likelihood Estimation for Zero-Resource Neural Machine Translation." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/594

Markdown

[Zheng et al. "Maximum Expected Likelihood Estimation for Zero-Resource Neural Machine Translation." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/zheng2017ijcai-maximum/) doi:10.24963/IJCAI.2017/594

BibTeX

@inproceedings{zheng2017ijcai-maximum,
  title     = {{Maximum Expected Likelihood Estimation for Zero-Resource Neural Machine Translation}},
  author    = {Zheng, Hao and Cheng, Yong and Liu, Yang},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {4251--4257},
  doi       = {10.24963/IJCAI.2017/594},
  url       = {https://mlanthology.org/ijcai/2017/zheng2017ijcai-maximum/}
}