Learning to Prompt Knowledge Transfer for Open-World Continual Learning
Abstract
This paper studies continual learning in an open-world scenario, referred to as Open-world Continual Learning (OwCL). OwCL is of growing importance yet highly challenging in two respects: i) learning a sequence of tasks without forgetting knowns from the past, and ii) identifying unknowns (novel objects/classes) in the future. Existing OwCL methods struggle to adapt task-aware boundaries between knowns and unknowns, and do not consider a mechanism for knowledge transfer. In this work, we propose Pro-KT, a novel prompt-enhanced knowledge transfer model for OwCL. Pro-KT comprises two key components: (1) a prompt bank that encodes and transfers both task-generic and task-specific knowledge, and (2) a task-aware open-set boundary that identifies unknowns in new tasks. Experimental results on two real-world datasets demonstrate that Pro-KT markedly outperforms state-of-the-art counterparts in both the detection of unknowns and the classification of knowns. Code is released at https://github.com/YujieLi42/Pro-KT.
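To make the two components concrete, here is a minimal toy sketch of the ideas the abstract names: a prompt bank holding one shared task-generic prompt plus one task-specific prompt per task, and a per-task open-set boundary that rejects low-confidence samples as unknowns. All names, shapes, and the thresholding rule are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np


class PromptBank:
    """Hypothetical sketch of a prompt bank: a shared task-generic prompt
    plus one task-specific prompt per task, prepended to token embeddings."""

    def __init__(self, embed_dim=8, prompt_len=2, seed=0):
        rng = np.random.default_rng(seed)
        # Task-generic prompt, shared and reused across all tasks.
        self.generic = rng.normal(scale=0.02, size=(prompt_len, embed_dim))
        self.specific = []  # one task-specific prompt per task

    def add_task(self):
        # Warm-start the new task's prompt from the generic prompt so
        # previously accumulated knowledge transfers forward (assumption).
        self.specific.append(self.generic.copy())

    def prepend(self, x, task_id):
        # x: (batch, seq_len, embed_dim) token embeddings.
        p = np.concatenate([self.generic, self.specific[task_id]], axis=0)
        p = np.broadcast_to(p, (x.shape[0],) + p.shape)
        return np.concatenate([p, x], axis=1)


def open_set_predict(probs, threshold):
    """Toy task-aware open-set boundary: samples whose maximum class
    probability falls below a per-task threshold are flagged as unknown
    (returned label -1); otherwise the argmax class is returned."""
    labels = probs.argmax(axis=1)
    labels[probs.max(axis=1) < threshold] = -1
    return labels
```

For example, with `prompt_len=2` the prepended sequence grows by 4 positions (2 generic + 2 task-specific), and `open_set_predict` maps any row of class probabilities peaking below the threshold to the "unknown" label -1.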
Cite
Text
Li et al. "Learning to Prompt Knowledge Transfer for Open-World Continual Learning." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I12.29275
Markdown
[Li et al. "Learning to Prompt Knowledge Transfer for Open-World Continual Learning." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/li2024aaai-learning-e/) doi:10.1609/AAAI.V38I12.29275
BibTeX
@inproceedings{li2024aaai-learning-e,
title = {{Learning to Prompt Knowledge Transfer for Open-World Continual Learning}},
author = {Li, Yujie and Yang, Xin and Wang, Hao and Wang, Xiangkun and Li, Tianrui},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2024},
pages = {13700--13708},
doi = {10.1609/AAAI.V38I12.29275},
url = {https://mlanthology.org/aaai/2024/li2024aaai-learning-e/}
}