Mixed Embedding of XLM for Unsupervised Cantonese-Chinese Neural Machine Translation (Student Abstract)
Abstract
Unsupervised Neural Machine Translation is well suited to Cantonese-Chinese translation because parallel data is scarce in this language pair. In this paper, we propose a method that combines a modified cross-lingual language model with layer-to-layer attention for unsupervised neural machine translation. In our experiments, we observed that our proposed method improves Cantonese-to-Chinese and Chinese-to-Cantonese translation by 1.088 and 0.394 BLEU points, respectively. We also developed a web service based on our proposed approach that provides Cantonese-to-Chinese translation and vice versa.
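The abstract only names the two ingredients, so the following is a minimal, hypothetical sketch of what "mixed embedding" and "layer-to-layer attention" could look like in PyTorch: a shared XLM-style embedding mixed with a language-specific one via a learned weight, and a decoder whose each layer cross-attends to the matching encoder layer's output. Class names (MixedEmbedding, LayerToLayerDecoder), the mixing scheme, and all hyperparameters are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch only; names and mixing scheme are assumptions,
# not taken from the paper.
import torch
import torch.nn as nn

class MixedEmbedding(nn.Module):
    """Mix a shared (XLM-style) embedding with a language-specific one."""
    def __init__(self, vocab_size, d_model, num_langs):
        super().__init__()
        self.shared = nn.Embedding(vocab_size, d_model)
        self.lang_specific = nn.ModuleList(
            [nn.Embedding(vocab_size, d_model) for _ in range(num_langs)]
        )
        # Learned scalar weight controlling how the two embeddings are mixed.
        self.alpha = nn.Parameter(torch.tensor(0.5))

    def forward(self, tokens, lang_id):
        return (self.alpha * self.shared(tokens)
                + (1 - self.alpha) * self.lang_specific[lang_id](tokens))

class LayerToLayerDecoder(nn.Module):
    """Each decoder layer cross-attends to the matching encoder layer output."""
    def __init__(self, d_model, nhead, num_layers):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
             for _ in range(num_layers)]
        )

    def forward(self, tgt, encoder_layer_outputs):
        # encoder_layer_outputs: one tensor of hidden states per encoder layer,
        # ordered to align with the decoder layers.
        out = tgt
        for layer, memory in zip(self.layers, encoder_layer_outputs):
            out = layer(out, memory)
        return out
```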
Cite
Text
Wong and Tsai. "Mixed Embedding of XLM for Unsupervised Cantonese-Chinese Neural Machine Translation (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I11.21677
Markdown
[Wong and Tsai. "Mixed Embedding of XLM for Unsupervised Cantonese-Chinese Neural Machine Translation (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/wong2022aaai-mixed/) doi:10.1609/AAAI.V36I11.21677
BibTeX
@inproceedings{wong2022aaai-mixed,
title = {{Mixed Embedding of XLM for Unsupervised Cantonese-Chinese Neural Machine Translation (Student Abstract)}},
author = {Wong, Ka Ming and Tsai, Richard Tzong-Han},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {13081-13082},
doi = {10.1609/AAAI.V36I11.21677},
url = {https://mlanthology.org/aaai/2022/wong2022aaai-mixed/}
}