Towards Neural Phrase-Based Machine Translation

Abstract

In this paper, we present Neural Phrase-based Machine Translation (NPMT). Our method explicitly models the phrase structures in output sequences using Sleep-WAke Networks (SWAN), a recently proposed segmentation-based sequence modeling method. To mitigate the monotonic alignment requirement of SWAN, we introduce a new layer that performs (soft) local reordering of input sequences. Unlike existing neural machine translation (NMT) approaches, NPMT does not use an attention-based decoding mechanism. Instead, it directly outputs phrases in sequential order and can decode in linear time. Our experiments show that NPMT achieves superior performance on the IWSLT 2014 German-English/English-German and IWSLT 2015 English-Vietnamese machine translation tasks compared with strong NMT baselines. We also observe that our method produces meaningful phrases in the output languages.
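
To make the (soft) local reordering idea concrete, below is a minimal PyTorch sketch, not the authors' implementation: each output position is a gated, tanh-squashed combination of the input embeddings in a small window around it, so the model can learn soft local permutations before a monotonic segmentation model such as SWAN consumes the sequence. All names (SoftLocalReordering, window_radius) and the exact gating form are illustrative assumptions.

  # Sketch of a soft local reordering layer (hypothetical names, not the paper's code).
  import torch
  import torch.nn as nn

  class SoftLocalReordering(nn.Module):
      def __init__(self, embed_dim: int, window_radius: int = 3):
          super().__init__()
          self.tau = window_radius                      # window covers 2*tau + 1 positions
          window = 2 * window_radius + 1
          # one gate per window slot, computed from the concatenated window
          self.gates = nn.Linear(window * embed_dim, window)

      def forward(self, x: torch.Tensor) -> torch.Tensor:
          # x: (batch, seq_len, embed_dim)
          b, t, d = x.shape
          pad = torch.zeros(b, self.tau, d, device=x.device, dtype=x.dtype)
          xp = torch.cat([pad, x, pad], dim=1)          # zero-pad both ends
          # gather the local window around every position: (b, t, 2*tau+1, d)
          windows = xp.unfold(1, 2 * self.tau + 1, 1).permute(0, 1, 3, 2)
          flat = windows.reshape(b, t, -1)
          weights = torch.sigmoid(self.gates(flat))     # (b, t, 2*tau+1) gate values
          # gated sum over the window, squashed back to the embedding range
          return torch.tanh((weights.unsqueeze(-1) * windows).sum(dim=2))

  # Usage: softly reorder source embeddings before a monotonic segment-level decoder.
  layer = SoftLocalReordering(embed_dim=256, window_radius=3)
  out = layer(torch.randn(8, 20, 256))                  # shape (8, 20, 256)

Because the gates depend on the whole window rather than on a single position, nearby tokens can effectively swap their contributions while the overall alignment stays monotonic, which is the property SWAN requires.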

Cite

Text

Huang et al. "Towards Neural Phrase-Based Machine Translation." International Conference on Learning Representations, 2018.

Markdown

[Huang et al. "Towards Neural Phrase-Based Machine Translation." International Conference on Learning Representations, 2018.](https://mlanthology.org/iclr/2018/huang2018iclr-neural/)

BibTeX

@inproceedings{huang2018iclr-neural,
  title     = {{Towards Neural Phrase-Based Machine Translation}},
  author    = {Huang, Po-Sen and Wang, Chong and Huang, Sitao and Zhou, Dengyong and Deng, Li},
  booktitle = {International Conference on Learning Representations},
  year      = {2018},
  url       = {https://mlanthology.org/iclr/2018/huang2018iclr-neural/}
}