Revisiting Iterative Back-Translation from the Perspective of Compositional Generalization
Abstract
Human intelligence exhibits compositional generalization (i.e., the ability to understand and produce unseen combinations of seen components), but current neural seq2seq models lack such ability. In this paper, we revisit iterative back-translation, a simple yet effective semi-supervised method, to investigate whether and how it can improve compositional generalization. Specifically: (1) we first empirically show that iterative back-translation substantially improves performance on compositional generalization benchmarks (CFQ and SCAN); (2) to understand why it is useful, we carefully examine the performance gains and find that iterative back-translation can increasingly correct errors in pseudo-parallel data; (3) to further encourage this mechanism, we propose curriculum iterative back-translation, which improves the quality of pseudo-parallel data more effectively and thus further improves performance.
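To make the method concrete, below is a minimal sketch of the iterative back-translation loop the abstract refers to. The Seq2Seq interface, the train/predict methods, and all variable names are hypothetical placeholders for illustration, not the authors' implementation.

from typing import Protocol

class Seq2Seq(Protocol):
    """Assumed interface for a trainable seq2seq model (hypothetical)."""
    def train(self, pairs: list[tuple[str, str]]) -> None: ...
    def predict(self, source: str) -> str: ...

def iterative_back_translation(
    parallel: list[tuple[str, str]],  # small seed set of (source, target) pairs
    mono_src: list[str],              # monolingual source-side data
    mono_tgt: list[str],              # monolingual target-side data
    src2tgt: Seq2Seq,
    tgt2src: Seq2Seq,
    n_iterations: int = 5,
) -> tuple[Seq2Seq, Seq2Seq]:
    # Warm-start both translation directions on the parallel seed data.
    src2tgt.train(parallel)
    tgt2src.train([(t, s) for (s, t) in parallel])

    for _ in range(n_iterations):
        # Back-translate each monolingual pool into pseudo-parallel pairs.
        pseudo_st = [(tgt2src.predict(t), t) for t in mono_tgt]
        pseudo_ts = [(src2tgt.predict(s), s) for s in mono_src]

        # Retrain each direction on real + pseudo-parallel data. As both
        # models improve, errors in the pseudo pairs are increasingly
        # corrected, which is the mechanism the paper analyzes.
        src2tgt.train(parallel + pseudo_st)
        tgt2src.train([(t, s) for (s, t) in parallel] + pseudo_ts)

    return src2tgt, tgt2src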
Cite
Text
Guo et al. "Revisiting Iterative Back-Translation from the Perspective of Compositional Generalization." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I9.16930Markdown
[Guo et al. "Revisiting Iterative Back-Translation from the Perspective of Compositional Generalization." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/guo2021aaai-revisiting/) doi:10.1609/AAAI.V35I9.16930BibTeX
@inproceedings{guo2021aaai-revisiting,
title = {{Revisiting Iterative Back-Translation from the Perspective of Compositional Generalization}},
author = {Guo, Yinuo and Zhu, Hualei and Lin, Zeqi and Chen, Bei and Lou, Jian-Guang and Zhang, Dongmei},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2021},
pages = {7601--7609},
doi = {10.1609/AAAI.V35I9.16930},
url = {https://mlanthology.org/aaai/2021/guo2021aaai-revisiting/}
}