Should We Be Pre-Training? An Argument for End-Task Aware Training as an Alternative

Abstract

In most settings of practical concern, machine learning practitioners know in advance what end-task they wish to boost with auxiliary tasks. However, widely used methods for leveraging auxiliary data like pre-training and its continued pre-training variant are end-task agnostic: they rarely, if ever, exploit knowledge of the target task. We study replacing end-task agnostic continued training of pre-trained language models with end-task aware training of said models. We argue that for sufficiently important end-tasks, the benefits of leveraging auxiliary data in a task-aware fashion can justify forgoing the traditional approach of obtaining generic, end-task agnostic representations as with (continued) pre-training. On three different low-resource NLP tasks from two domains, we demonstrate that multi-tasking the end-task and auxiliary objectives results in significantly better downstream task performance than the widely used task-agnostic continued pre-training paradigm of Gururangan et al. (2020). We next introduce an online meta-learning algorithm that learns a set of multi-task weights to better balance among our multiple auxiliary objectives, achieving further improvements on end-task performance and data efficiency.
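The abstract describes two ingredients: multi-tasking the end-task with auxiliary objectives, and an online meta-learning step that adapts the auxiliary task weights toward the end-task. The sketch below is a minimal, illustrative PyTorch-style rendering of that idea, not the authors' released implementation: the toy model, dimensions, and the gradient-alignment rule used to update the weights are all assumptions made for the example.

# Minimal sketch of end-task aware multi-task training with adaptive
# auxiliary task weights. Everything here (toy encoder, dimensions, and the
# cosine-alignment weight update) is an illustrative assumption, not the
# paper's exact algorithm or code.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy shared encoder with one head per objective (index 0 = end-task).
shared = nn.Linear(16, 32)
heads = nn.ModuleList([nn.Linear(32, 2) for _ in range(3)])
params = list(shared.parameters()) + list(heads.parameters())
opt = torch.optim.Adam(params, lr=1e-3)

# Learnable logits over the two auxiliary objectives (softmax -> task weights).
weight_logits = torch.zeros(2, requires_grad=True)
weight_opt = torch.optim.Adam([weight_logits], lr=1e-2)

loss_fn = nn.CrossEntropyLoss()

def task_loss(i, x, y):
    return loss_fn(heads[i](torch.relu(shared(x))), y)

def grad_cosine(g1, g2):
    # Cosine similarity between two gradient lists, skipping unused params.
    num = sq1 = sq2 = 0.0
    for a, b in zip(g1, g2):
        if a is None or b is None:
            continue
        num = num + (a * b).sum()
        sq1 = sq1 + (a * a).sum()
        sq2 = sq2 + (b * b).sum()
    return num / (sq1.sqrt() * sq2.sqrt() + 1e-8)

for step in range(100):
    # Fake batches: one for the end-task, one per auxiliary objective.
    batches = [(torch.randn(8, 16), torch.randint(0, 2, (8,))) for _ in range(3)]

    # 1) Multi-task step: end-task loss plus softmax-weighted auxiliary losses.
    w = torch.softmax(weight_logits, dim=0).detach()
    total = task_loss(0, *batches[0]) + sum(
        w[j] * task_loss(j + 1, *batches[j + 1]) for j in range(2))
    opt.zero_grad()
    total.backward()
    opt.step()

    # 2) Weight step: upweight auxiliary objectives whose gradients align with
    #    the end-task gradient (a simple proxy for meta-learned task weights).
    end_grad = torch.autograd.grad(task_loss(0, *batches[0]), params, allow_unused=True)
    sims = torch.stack([
        grad_cosine(end_grad,
                    torch.autograd.grad(task_loss(j + 1, *batches[j + 1]),
                                        params, allow_unused=True))
        for j in range(2)])
    weight_loss = -(torch.softmax(weight_logits, dim=0) * sims).sum()
    weight_opt.zero_grad()
    weight_loss.backward()
    weight_opt.step()

The key contrast with end-task agnostic continued pre-training is visible in step 1: the end-task loss participates in every update rather than only after auxiliary training finishes, and step 2 keeps the auxiliary mixture tuned to whatever most helps that end-task.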

Cite

Text

Dery et al. "Should We Be Pre-Training? An Argument for End-Task Aware Training as an Alternative." International Conference on Learning Representations, 2022.

Markdown

[Dery et al. "Should We Be Pre-Training? An Argument for End-Task Aware Training as an Alternative." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/dery2022iclr-we/)

BibTeX

@inproceedings{dery2022iclr-we,
  title     = {{Should We Be Pre-Training? An Argument for End-Task Aware Training as an Alternative}},
  author    = {Dery, Lucio M. and Michel, Paul and Talwalkar, Ameet and Neubig, Graham},
  booktitle = {International Conference on Learning Representations},
  year      = {2022},
  url       = {https://mlanthology.org/iclr/2022/dery2022iclr-we/}
}