Tackling Non-Forgetting and Forward Transfer with a Unified Lifelong Learning Approach
Abstract
Humans are the best example of agents that can learn a variety of skills incrementally over the course of their lives, and imbuing machines with this ability is the goal of lifelong machine learning. Ideally, lifelong learning should achieve non-forgetting, forward and backward transfer, avoid confusion, support few-shot learning, and so on. Previous approaches have focused on subsets of these properties, often by stitching together an array of separate mechanisms. In this work, we propose a simple yet powerful unified framework that supports almost all of these properties through *one* central consolidation mechanism. We then describe a particular instance of this framework designed to support non-forgetting and forward transfer. This novel approach works by efficiently locating sparse neural sub-networks and controlling their consolidation during lifelong learning.
Cite
Text
Yun et al. "Tackling Non-Forgetting and Forward Transfer with a Unified Lifelong Learning Approach." ICML 2020 Workshops: LifelongML, 2020.
Markdown
[Yun et al. "Tackling Non-Forgetting and Forward Transfer with a Unified Lifelong Learning Approach." ICML 2020 Workshops: LifelongML, 2020.](https://mlanthology.org/icmlw/2020/yun2020icmlw-tackling/)
BibTeX
@inproceedings{yun2020icmlw-tackling,
title = {{Tackling Non-Forgetting and Forward Transfer with a Unified Lifelong Learning Approach}},
author = {Yun, Xinyu and Bohn, Tanner A. and Ling, Charles X.},
booktitle = {ICML 2020 Workshops: LifelongML},
year = {2020},
url = {https://mlanthology.org/icmlw/2020/yun2020icmlw-tackling/}
}