Neural Machine Translation: A Review

Abstract

The field of machine translation (MT), the automatic translation of written text from one natural language into another, has experienced a major paradigm shift in recent years. Statistical MT, which mainly relies on various count-based models and which used to dominate MT research for decades, has largely been superseded by neural machine translation (NMT), which tackles translation with a single neural network. In this work we will trace back the origins of modern NMT architectures to word and sentence embeddings and earlier examples of the encoder-decoder network family. We will conclude with a short survey of more recent trends in the field.
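
To make the encoder-decoder idea mentioned in the abstract concrete, here is a minimal sketch of a recurrent encoder-decoder translation model in PyTorch. It is illustrative only and is not taken from the paper; the layer sizes, toy vocabulary sizes, and random inputs are assumptions made purely for the example.

    # Minimal sketch (illustrative, not from the paper) of the encoder-decoder idea:
    # a single neural network that reads a source sentence into a summary vector and
    # generates the target sentence token by token. All sizes below are assumptions.
    import torch
    import torch.nn as nn

    SRC_VOCAB, TGT_VOCAB, EMB, HID = 100, 100, 32, 64  # toy sizes (assumed)

    class Encoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(SRC_VOCAB, EMB)       # source word embeddings
            self.rnn = nn.GRU(EMB, HID, batch_first=True)

        def forward(self, src_ids):
            # src_ids: (batch, src_len) -> final hidden state summarising the sentence
            _, hidden = self.rnn(self.embed(src_ids))
            return hidden                                   # (1, batch, HID)

    class Decoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(TGT_VOCAB, EMB)       # target word embeddings
            self.rnn = nn.GRU(EMB, HID, batch_first=True)
            self.out = nn.Linear(HID, TGT_VOCAB)            # scores over target vocabulary

        def forward(self, tgt_ids, hidden):
            # tgt_ids: (batch, tgt_len); hidden: encoder summary used as initial state
            outputs, hidden = self.rnn(self.embed(tgt_ids), hidden)
            return self.out(outputs), hidden                # (batch, tgt_len, TGT_VOCAB)

    # One forward pass on random token ids, just to show the data flow.
    src = torch.randint(0, SRC_VOCAB, (2, 7))               # batch of 2 source sentences
    tgt = torch.randint(0, TGT_VOCAB, (2, 5))               # shifted target sentences
    logits, _ = Decoder()(tgt, Encoder()(src))
    print(logits.shape)                                     # torch.Size([2, 5, 100])

Using the final encoder state to initialise the decoder is the simplest member of the encoder-decoder family; later architectures surveyed in the paper replace this fixed summary with attention over all encoder states.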

Cite

Text

Stahlberg. "Neural Machine Translation: A Review." Journal of Artificial Intelligence Research, 2020. doi:10.1613/JAIR.1.12007

Markdown

[Stahlberg. "Neural Machine Translation: A Review." Journal of Artificial Intelligence Research, 2020.](https://mlanthology.org/jair/2020/stahlberg2020jair-neural/) doi:10.1613/JAIR.1.12007

BibTeX

@article{stahlberg2020jair-neural,
  title     = {{Neural Machine Translation: A Review}},
  author    = {Stahlberg, Felix},
  journal   = {Journal of Artificial Intelligence Research},
  year      = {2020},
  pages     = {343--418},
  doi       = {10.1613/JAIR.1.12007},
  volume    = {69},
  url       = {https://mlanthology.org/jair/2020/stahlberg2020jair-neural/}
}