Automatically Paraphrasing via Sentence Reconstruction and Round-Trip Translation

Abstract

Paraphrase generation plays a key role in NLP tasks such as question answering, machine translation, and information retrieval. In this paper, we propose a novel framework for paraphrase generation that simultaneously decodes the output sentence using a pretrained wordset-to-sequence model and a round-trip translation model. We evaluate this framework on Quora, WikiAnswers, MSCOCO, and Twitter, and show that it outperforms previous state-of-the-art unsupervised and distantly-supervised methods by significant margins on all datasets. On Quora and WikiAnswers, our framework even outperforms some strongly supervised methods with domain adaptation. Further, we show that the generated paraphrases can be used to augment the training data for machine translation, yielding substantial improvements.
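The core idea of simultaneous decoding with two models can be illustrated as a toy sketch: at each step, the next token is chosen by a weighted combination of the two models' log-probabilities. Everything below (the toy vocabulary, the stand-in scoring functions, the `alpha` weight) is an assumed simplification for illustration, not the paper's actual models.

```python
# Hedged sketch: greedy decoding that combines log-probabilities from two
# scoring models, in the spirit of decoding with a wordset-to-sequence
# model and a round-trip translation model simultaneously.
# All names and scores here are illustrative assumptions.

VOCAB = ["what", "is", "the", "best", "way", "to", "learn", "nlp", "<eos>"]
TARGET = ["what", "is", "the", "best", "way", "to", "learn", "nlp", "<eos>"]

def recon_model(prefix):
    """Stand-in for the wordset-to-sequence reconstruction model:
    strongly prefers the next token of a fixed toy sentence."""
    pos = min(len(prefix), len(TARGET) - 1)
    return {tok: (0.0 if tok == TARGET[pos] else -5.0) for tok in VOCAB}

def rtt_model(prefix):
    """Stand-in for the round-trip translation model. Here it is uniform
    for simplicity; a real model's distribution would also shape the choice."""
    return {tok: -1.0 for tok in VOCAB}

def joint_greedy_decode(alpha=0.5, max_len=10):
    """Greedily pick the token maximizing a weighted sum of the two
    models' log-probabilities at each decoding step."""
    prefix = []
    for _ in range(max_len):
        p1 = recon_model(prefix)
        p2 = rtt_model(prefix)
        combined = {t: alpha * p1[t] + (1 - alpha) * p2[t] for t in VOCAB}
        nxt = max(combined, key=combined.get)
        if nxt == "<eos>":
            break
        prefix.append(nxt)
    return " ".join(prefix)
```

In practice the two models would be full neural decoders sharing a vocabulary, and the combination would typically be applied inside beam search rather than greedy decoding; the weight `alpha` balancing the two models is an assumed hyperparameter here.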

Cite

Text

Guo et al. "Automatically Paraphrasing via Sentence Reconstruction and Round-Trip Translation." International Joint Conference on Artificial Intelligence, 2021. doi:10.24963/IJCAI.2021/525

Markdown

[Guo et al. "Automatically Paraphrasing via Sentence Reconstruction and Round-Trip Translation." International Joint Conference on Artificial Intelligence, 2021.](https://mlanthology.org/ijcai/2021/guo2021ijcai-automatically/) doi:10.24963/IJCAI.2021/525

BibTeX

@inproceedings{guo2021ijcai-automatically,
  title     = {{Automatically Paraphrasing via Sentence Reconstruction and Round-Trip Translation}},
  author    = {Guo, Zilu and Huang, Zhongqiang and Zhu, Kenny Q. and Chen, Guandan and Zhang, Kaibo and Chen, Boxing and Huang, Fei},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {3815--3821},
  doi       = {10.24963/IJCAI.2021/525},
  url       = {https://mlanthology.org/ijcai/2021/guo2021ijcai-automatically/}
}