Enhancing Cross-Lingual Transfer by Manifold Mixup

Abstract

Based on large-scale pre-trained multilingual representations, recent cross-lingual transfer methods have achieved impressive transfer performance. However, the performance on target languages still lags far behind that on the source language. In this paper, our analyses indicate that this performance gap is strongly associated with the cross-lingual representation discrepancy. To achieve better cross-lingual transfer performance, we propose the cross-lingual manifold mixup (X-Mixup) method, which adaptively calibrates the representation discrepancy and yields a compromise representation for target languages. Experiments on the XTREME benchmark show that X-Mixup achieves 1.8% performance gains on multiple text understanding tasks compared with strong baselines, and significantly reduces the cross-lingual representation discrepancy.
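The core idea of manifold mixup, as applied here, is to linearly interpolate hidden representations rather than raw inputs. The following is a minimal sketch of that interpolation step, not the paper's implementation: the function name `x_mixup`, the toy tensors, and the fixed mixing ratio `lam` are illustrative assumptions (in the paper the ratio is calibrated adaptively during training).

```python
import numpy as np

def x_mixup(h_src: np.ndarray, h_tgt: np.ndarray, lam: float) -> np.ndarray:
    """Linearly interpolate source- and target-language hidden states.

    A hypothetical sketch of the manifold-mixup interpolation;
    `lam` stands in for the adaptively learned mixing ratio.
    """
    return lam * h_src + (1.0 - lam) * h_tgt

# Toy hidden states (batch=1, seq_len=2, hidden_dim=3), made-up values.
h_src = np.ones((1, 2, 3))    # source-language representation
h_tgt = np.zeros((1, 2, 3))   # target-language representation

mixed = x_mixup(h_src, h_tgt, lam=0.3)
print(mixed[0, 0])  # → [0.3 0.3 0.3]
```

With a fixed `lam` this is plain linear interpolation; the calibration in X-Mixup lies in how the mixing ratio is chosen per example during training, which this sketch does not model.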

Cite

Text

Yang et al. "Enhancing Cross-Lingual Transfer by Manifold Mixup." International Conference on Learning Representations, 2022.

Markdown

[Yang et al. "Enhancing Cross-Lingual Transfer by Manifold Mixup." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/yang2022iclr-enhancing/)

BibTeX

@inproceedings{yang2022iclr-enhancing,
  title     = {{Enhancing Cross-Lingual Transfer by Manifold Mixup}},
  author    = {Yang, Huiyun and Chen, Huadong and Zhou, Hao and Li, Lei},
  booktitle = {International Conference on Learning Representations},
  year      = {2022},
  url       = {https://mlanthology.org/iclr/2022/yang2022iclr-enhancing/}
}