Neural Weight Search for Scalable Task Incremental Learning
Abstract
Task incremental learning aims to enable a system to maintain its performance on previously learned tasks while learning new tasks, thereby addressing the problem of catastrophic forgetting. One promising approach is to build an individual network or sub-network for each new task. However, this leads to ever-growing memory cost, since extra weights must be saved for every new task, and how to address this issue has remained an open problem in task incremental learning. In this paper, we introduce a novel Neural Weight Search technique that designs a fixed search space in which the optimal combinations of frozen weights can be searched to build new models for novel tasks in an end-to-end manner, resulting in scalable and controllable memory growth. Extensive experiments on two benchmarks, i.e., Split-CIFAR-100 and CUB-to-Sketches, show that our method achieves state-of-the-art performance with respect to both average inference accuracy and total memory cost.
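To make the core idea concrete, below is a minimal PyTorch sketch of building a layer for a new task by selecting from a fixed pool of frozen weights. It is an illustrative assumption, not the paper's actual implementation: the class names (`WeightPool`, `SearchedConv2d`), the kernel-level granularity, and the softmax relaxation used to keep the search end-to-end differentiable are all hypothetical choices made for this example.

```python
# Hypothetical sketch: assemble a conv layer for a new task from a frozen
# pool of kernels, training only the selection scores. Not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeightPool(nn.Module):
    """A fixed, frozen pool of 3x3 kernels shared across all tasks."""

    def __init__(self, pool_size: int = 64, kernel_size: int = 3):
        super().__init__()
        kernels = torch.randn(pool_size, kernel_size, kernel_size)
        # Registered as a buffer: the pool itself is never updated.
        self.register_buffer("kernels", kernels)


class SearchedConv2d(nn.Module):
    """Conv layer whose kernels are soft selections from the frozen pool.

    Only the selection logits are trainable, so the per-task memory is the
    small logit/index tensor rather than a full copy of the layer weights.
    """

    def __init__(self, pool: WeightPool, in_ch: int, out_ch: int):
        super().__init__()
        self.pool = pool
        p, k, _ = pool.kernels.shape
        # One score per (output channel, input channel, pool entry).
        self.logits = nn.Parameter(torch.zeros(out_ch, in_ch, p))
        self.kernel_size = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Soft selection keeps the search differentiable; at deployment one
        # could take the argmax and store only the chosen pool indices.
        probs = F.softmax(self.logits, dim=-1)                 # (out, in, p)
        weight = torch.einsum("oip,pkl->oikl", probs, self.pool.kernels)
        return F.conv2d(x, weight, padding=self.kernel_size // 2)


if __name__ == "__main__":
    pool = WeightPool(pool_size=64)
    layer = SearchedConv2d(pool, in_ch=3, out_ch=8)
    out = layer(torch.randn(2, 3, 32, 32))
    print(out.shape)  # torch.Size([2, 8, 32, 32])
```

Under this (assumed) formulation, learning a new task only optimizes the selection scores against the frozen pool, which is one way the memory growth per task could stay small and controllable.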
Cite
Text
Jiang and Celiktutan. "Neural Weight Search for Scalable Task Incremental Learning." Winter Conference on Applications of Computer Vision, 2023.
Markdown
[Jiang and Celiktutan. "Neural Weight Search for Scalable Task Incremental Learning." Winter Conference on Applications of Computer Vision, 2023.](https://mlanthology.org/wacv/2023/jiang2023wacv-neural/)
BibTeX
@inproceedings{jiang2023wacv-neural,
title = {{Neural Weight Search for Scalable Task Incremental Learning}},
author = {Jiang, Jian and Celiktutan, Oya},
booktitle = {Winter Conference on Applications of Computer Vision},
year = {2023},
pages = {1390--1399},
url = {https://mlanthology.org/wacv/2023/jiang2023wacv-neural/}
}