Tasks Without Borders: A New Approach to Online Multi-Task Learning

Abstract

We introduce MTLAB, a new algorithm for learning multiple related tasks with strong theoretical guarantees. Its key idea is to perform learning sequentially over the data of all tasks, without interruptions or restarts at task boundaries. Predictors for individual tasks are derived from this process by an additional online-to-batch conversion step. By learning across task boundaries, MTLAB achieves regret, measured in terms of true risks, that is sublinear in the number of tasks. In the lifelong learning setting, this leads to an improved generalization bound that converges with the total number of samples across all observed tasks, instead of the number of examples per task or the number of tasks independently. At the same time, it is widely applicable: it can handle finite sets of tasks, as is common in multi-task learning, as well as stochastic task sequences, as studied in lifelong learning.
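The core idea described above can be illustrated with a minimal sketch. This is not the paper's exact algorithm; it assumes a simple online gradient learner on squared loss and uses iterate averaging as the online-to-batch conversion. The single learner state `w` is shared across all tasks and is never reset at a task boundary:

```python
# Hedged sketch of the across-boundary idea (assumed details, not MTLAB itself):
# one online learner runs over the concatenated data of all tasks, and each
# task's predictor is the average of the iterates seen during that task
# (a standard online-to-batch conversion).

def online_across_tasks(task_streams, lr=0.1):
    """task_streams: list of tasks, each a list of (x, y) pairs (scalars)."""
    w = 0.0                    # shared online iterate, never reset between tasks
    predictors = []
    for stream in task_streams:
        iterates = []
        for x, y in stream:
            grad = 2 * (w * x - y) * x   # gradient of the squared loss (w*x - y)^2
            w -= lr * grad               # online gradient step
            iterates.append(w)
        # online-to-batch conversion: the task predictor is the average iterate
        predictors.append(sum(iterates) / len(iterates))
    return predictors
```

Because learning continues across boundaries, a later task starts from the state accumulated on earlier related tasks, so its averaged predictor benefits from all previously seen samples rather than only its own.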

Cite

Text

Zimin and Lampert. "Tasks Without Borders: A New Approach to Online Multi-Task Learning." ICML 2019 Workshops: AMTL, 2019.

Markdown

[Zimin and Lampert. "Tasks Without Borders: A New Approach to Online Multi-Task Learning." ICML 2019 Workshops: AMTL, 2019.](https://mlanthology.org/icmlw/2019/zimin2019icmlw-tasks/)

BibTeX

@inproceedings{zimin2019icmlw-tasks,
  title     = {{Tasks Without Borders: A New Approach to Online Multi-Task Learning}},
  author    = {Zimin, Alexander and Lampert, Christoph H.},
  booktitle = {ICML 2019 Workshops: AMTL},
  year      = {2019},
  url       = {https://mlanthology.org/icmlw/2019/zimin2019icmlw-tasks/}
}