ExT5: Towards Extreme Multi-Task Scaling for Transfer Learning

Abstract

Despite the recent success of multi-task learning and transfer learning for natural language processing (NLP), few works have systematically studied the effect of scaling up the number of tasks during pre-training. Towards this goal, this paper introduces ExMix (Extreme Mixture): a massive collection of 107 supervised NLP tasks across diverse domains and task-families. Using ExMix, we study the effect of multi-task pre-training at the largest scale to date, and analyze co-training transfer amongst common families of tasks. Through this analysis, we show that manually curating an ideal set of tasks for multi-task pre-training is not straightforward, and that multi-task scaling can vastly improve models on its own. Finally, we propose ExT5: a model pre-trained using a multi-task objective of self-supervised span denoising and supervised ExMix. Via extensive experiments, we show that ExT5 outperforms strong T5 baselines on SuperGLUE, GEM, Rainbow, Closed-Book QA tasks, and several tasks outside of ExMix. ExT5 also significantly improves sample efficiency while pre-training.
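Below is a minimal, hypothetical sketch (not the authors' code) of the idea the abstract describes: a T5-style text-to-text pre-training mixture that interleaves a self-supervised span-denoising objective with supervised tasks such as those in ExMix. The task prefixes, sentinel scheme, and mixing rate are illustrative assumptions.

```python
import random

# Sentinel tokens used in T5-style span corruption (illustrative).
SENTINELS = [f"<extra_id_{i}>" for i in range(100)]


def span_corruption(text: str, corruption_rate: float = 0.15) -> dict:
    """Mask a random span with a sentinel; the target reconstructs it."""
    tokens = text.split()
    n_mask = max(1, int(len(tokens) * corruption_rate))
    start = random.randrange(0, max(1, len(tokens) - n_mask))
    masked = tokens[start:start + n_mask]
    inputs = tokens[:start] + [SENTINELS[0]] + tokens[start + n_mask:]
    targets = [SENTINELS[0]] + masked + [SENTINELS[1]]
    return {"inputs": " ".join(inputs), "targets": " ".join(targets)}


def supervised_example(task_name: str, example: dict) -> dict:
    """Cast a supervised example to text-to-text with a task prefix (assumed format)."""
    return {"inputs": f"{task_name}: {example['input']}", "targets": example["output"]}


def sample_mixture(unlabeled_texts, supervised_tasks, denoise_rate=0.5):
    """Yield pre-training examples, mixing denoising and supervised tasks.

    `denoise_rate` controls the proportion of self-supervised examples;
    the value here is an assumption, not the ratio used in the paper.
    """
    while True:
        if random.random() < denoise_rate:
            yield span_corruption(random.choice(unlabeled_texts))
        else:
            task_name, examples = random.choice(list(supervised_tasks.items()))
            yield supervised_example(task_name, random.choice(examples))
```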

Cite

Text

Aribandi et al. "ExT5: Towards Extreme Multi-Task Scaling for Transfer Learning." International Conference on Learning Representations, 2022.

Markdown

[Aribandi et al. "ExT5: Towards Extreme Multi-Task Scaling for Transfer Learning." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/aribandi2022iclr-ext5/)

BibTeX

@inproceedings{aribandi2022iclr-ext5,
  title     = {{ExT5: Towards Extreme Multi-Task Scaling for Transfer Learning}},
  author    = {Aribandi, Vamsi and Tay, Yi and Schuster, Tal and Rao, Jinfeng and Zheng, Huaixiu Steven and Mehta, Sanket Vaibhav and Zhuang, Honglei and Tran, Vinh Q. and Bahri, Dara and Ni, Jianmo and Gupta, Jai and Hui, Kai and Ruder, Sebastian and Metzler, Donald},
  booktitle = {International Conference on Learning Representations},
  year      = {2022},
  url       = {https://mlanthology.org/iclr/2022/aribandi2022iclr-ext5/}
}