Continual Learning in Open-Vocabulary Classification with Complementary Memory Systems
Abstract
We introduce a method for flexible and efficient continual learning in open-vocabulary image classification, drawing inspiration from the complementary learning systems observed in human cognition. Specifically, we propose to combine predictions from a CLIP zero-shot model and an exemplar-based model, using the zero-shot estimated probability that a sample's class is among the exemplar classes. We also propose a "tree probe" method, an adaptation of lazy learning principles, which enables fast learning from new examples with accuracy competitive with batch-trained linear models. We evaluate in data-incremental, class-incremental, and task-incremental settings, as well as the ability to perform flexible inference on varying subsets of zero-shot and learned categories. Our method achieves a good balance of learning speed, target-task accuracy, and zero-shot effectiveness. Code is available at https://github.com/jessemelpolio/TreeProbe.
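The abstract's prediction-mixing rule can be sketched concretely. The snippet below is a minimal illustration, not the authors' implementation: it blends exemplar-model class probabilities with CLIP zero-shot probabilities, weighted by the zero-shot probability mass that falls on the exemplar classes. All names (`mix_predictions`, `zs_probs`, `ex_probs`, `exemplar_idx`) are hypothetical, and the exemplar model itself (the paper's tree probe) is treated as a black box that supplies `ex_probs`.

```python
import numpy as np

def mix_predictions(zs_probs, ex_probs, exemplar_idx):
    """Blend zero-shot and exemplar-model predictions (illustrative sketch).

    zs_probs:     (C,) zero-shot class probabilities over the full label set.
    ex_probs:     (K,) exemplar-model probabilities over the K exemplar classes.
    exemplar_idx: (K,) indices of the exemplar classes within the full label set.
    """
    # Zero-shot estimate that the sample's true class is among the exemplar classes.
    p_in = zs_probs[exemplar_idx].sum()

    mixed = zs_probs.copy()
    # Within the exemplar label set, defer to the exemplar model, weighted by p_in;
    # outside it, the zero-shot probabilities (summing to 1 - p_in) are kept, so the
    # result remains a valid distribution over all C classes.
    mixed[exemplar_idx] = p_in * ex_probs
    return mixed

# Toy usage: 5 total classes, of which classes 1 and 3 have stored exemplars.
zs = np.array([0.1, 0.4, 0.1, 0.3, 0.1])   # zero-shot probabilities
ex = np.array([0.9, 0.1])                  # exemplar model over classes [1, 3]
print(mix_predictions(zs, ex, np.array([1, 3])))  # output sums to 1.0
```

Under these assumptions, when the zero-shot model assigns little mass to the exemplar classes (`p_in` near 0), the prediction stays close to pure zero-shot; when it is confident the class has been seen (`p_in` near 1), the exemplar model dominates.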
Cite

Text
Zhu et al. "Continual Learning in Open-Vocabulary Classification with Complementary Memory Systems." Transactions on Machine Learning Research, 2024.

Markdown
[Zhu et al. "Continual Learning in Open-Vocabulary Classification with Complementary Memory Systems." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/zhu2024tmlr-continual/)

BibTeX
@article{zhu2024tmlr-continual,
  title   = {{Continual Learning in Open-Vocabulary Classification with Complementary Memory Systems}},
  author  = {Zhu, Zhen and Lyu, Weijie and Xiao, Yao and Hoiem, Derek},
  journal = {Transactions on Machine Learning Research},
  year    = {2024},
  url     = {https://mlanthology.org/tmlr/2024/zhu2024tmlr-continual/}
}