Evolving Parameterized Prompt Memory for Continual Learning
Abstract
Recent studies have demonstrated the potency of leveraging prompts in Transformers for continual learning (CL). Nevertheless, employing a discrete key-prompt bottleneck can lead to selection mismatches and inappropriate prompt associations during testing. Furthermore, this approach hinders adaptive prompting because prompts cannot be shared among nearly identical instances at a more granular level. To address these challenges, we introduce Evolving Parameterized Prompt Memory (EvoPrompt), a novel method that attaches adaptive, continuous prompting to a pre-trained Vision Transformer (ViT), conditioned on each specific instance. We formulate a continuous prompt function as a neural bottleneck and encode the collection of prompts in the network weights. We establish a paired prompt memory system consisting of a stable reference prompt memory and a flexible working prompt memory. Inspired by linear mode connectivity, we progressively fuse the working and reference prompt memories during inter-task periods, yielding a continually evolving prompt memory. This fusion aligns functionally equivalent prompts using optimal transport and aggregates them in parameter space with an adjustable bias based on prompt node attribution. Additionally, to enhance backward compatibility, we propose compositional classifier initialization, which leverages prior prototypes from pre-trained models to guide the initialization of new classifiers in a subspace-aware manner. Comprehensive experiments validate that our approach achieves state-of-the-art performance in both class and domain incremental learning scenarios.
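To make the fusion step concrete, below is a minimal sketch of aligning two prompt memories and merging them in parameter space. All names and shapes are illustrative assumptions, and a hard permutation-based transport plan with a single scalar mixing coefficient stands in for the paper's optimal-transport alignment and attribution-biased aggregation; this is not the authors' implementation.

import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
d_in, d_hidden = 8, 16  # hypothetical prompt-network dimensions

# Rows are hidden-unit ("prompt node") weight vectors of two single-layer
# prompt memories: a stable reference and a flexible working memory.
W_ref = rng.normal(size=(d_hidden, d_in))
W_work = rng.normal(size=(d_hidden, d_in))

# Optimal transport with uniform marginals and a permutation constraint
# reduces to linear assignment on a pairwise cost between hidden units.
cost = np.linalg.norm(W_ref[:, None, :] - W_work[None, :, :], axis=-1)
_, col = linear_sum_assignment(cost)  # unit i of ref <-> unit col[i] of work
W_work_aligned = W_work[col]          # reorder working units to match reference

# Fuse in parameter space. The paper biases this average using prompt node
# attribution; a single scalar coefficient stands in for that here.
alpha = 0.5
W_fused = alpha * W_ref + (1 - alpha) * W_work_aligned
print("fused prompt memory:", W_fused.shape)

In the attribution-biased variant described in the abstract, the scalar alpha would become a per-unit weight vector so that more important prompt nodes lean toward one memory or the other.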
Cite
Text
Kurniawan et al. "Evolving Parameterized Prompt Memory for Continual Learning." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I12.29231
Markdown
[Kurniawan et al. "Evolving Parameterized Prompt Memory for Continual Learning." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/kurniawan2024aaai-evolving/) doi:10.1609/AAAI.V38I12.29231
BibTeX
@inproceedings{kurniawan2024aaai-evolving,
title = {{Evolving Parameterized Prompt Memory for Continual Learning}},
author = {Kurniawan, Muhammad Rifki and Song, Xiang and Ma, Zhiheng and He, Yuhang and Gong, Yihong and Qi, Yang and Wei, Xing},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2024},
pages = {13301--13309},
doi = {10.1609/AAAI.V38I12.29231},
url = {https://mlanthology.org/aaai/2024/kurniawan2024aaai-evolving/}
}