kNN-CLIP: Retrieval Enables Training-Free Segmentation on Continually Expanding Large Vocabularies
Abstract
Continual segmentation has not yet tackled the challenge of improving open-vocabulary segmentation models with training data for accurate segmentation across large, continually expanding vocabularies. We discover that traditional continual training results in severe catastrophic forgetting, failing to outperform a zero-shot segmentation baseline. We introduce kNN-CLIP, a novel training-free strategy for semantic and panoptic segmentation that augments the model with a database of instance embeddings and achieves zero forgetting. We demonstrate that kNN-CLIP can adapt to continually growing vocabularies without the need for retraining or large memory costs. kNN-CLIP enables open-vocabulary segmentation methods to expand their vocabularies on any domain with a single pass through the data, while only storing compact embeddings. This approach minimizes both compute and memory costs. kNN-CLIP achieves state-of-the-art performance across large-vocabulary semantic and panoptic segmentation datasets. We hope kNN-CLIP represents a significant step forward in enabling more efficient and adaptable continual segmentation, paving the way for advances in real-world large-vocabulary continual segmentation methods.
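As a rough illustration of the retrieval idea described in the abstract, the sketch below blends zero-shot similarities against class text embeddings with class votes from the k nearest neighbours in a stored database of instance embeddings. This is a minimal sketch under assumptions made here for illustration, not the authors' released implementation: the function names (build_database, knn_augmented_scores), the soft-voting scheme, and the interpolation weight lam are all hypothetical.

# Hypothetical sketch of retrieval-augmented open-vocabulary classification
# in the spirit of kNN-CLIP; names and the blending scheme are assumptions.
import numpy as np

def build_database(instance_embeddings, labels):
    # Store L2-normalized instance embeddings from a frozen image encoder
    # together with their integer class labels (new classes may be added
    # by simply appending rows, without any retraining).
    emb = np.asarray(instance_embeddings, dtype=np.float32)
    emb /= np.linalg.norm(emb, axis=1, keepdims=True) + 1e-8
    return emb, np.asarray(labels)

def knn_augmented_scores(query, text_embeddings, db_emb, db_labels,
                         num_classes, k=8, lam=0.5):
    # query: (D,) embedding of one mask/instance proposal.
    # text_embeddings: (C, D) L2-normalized class text embeddings.
    q = query / (np.linalg.norm(query) + 1e-8)

    # Zero-shot scores: cosine similarity to each class text embedding.
    zero_shot = text_embeddings @ q

    # Retrieval scores: similarities to the k nearest stored instances,
    # accumulated per class as a soft vote (an assumption in this sketch).
    sims = db_emb @ q
    topk = np.argsort(-sims)[:k]
    knn = np.zeros(num_classes, dtype=np.float32)
    for idx in topk:
        knn[db_labels[idx]] += sims[idx]

    # Interpolate the two score vectors; lam trades retrieval vs. zero-shot.
    return lam * knn + (1.0 - lam) * zero_shot

Because the database only holds compact embeddings and class ids, expanding the vocabulary amounts to one forward pass over new data plus an append to the index, which is what keeps memory and compute costs low in this kind of approach.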
Cite
Text
Gui et al. "kNN-CLIP: Retrieval Enables Training-Free Segmentation on Continually Expanding Large Vocabularies." Transactions on Machine Learning Research, 2024.
Markdown
[Gui et al. "kNN-CLIP: Retrieval Enables Training-Free Segmentation on Continually Expanding Large Vocabularies." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/gui2024tmlr-knnclip/)
BibTeX
@article{gui2024tmlr-knnclip,
title = {{kNN-CLIP: Retrieval Enables Training-Free Segmentation on Continually Expanding Large Vocabularies}},
author = {Gui, Zhongrui and Sun, Shuyang and Li, Runjia and Yuan, Jianhao and An, Zhaochong and Roth, Karsten and Prabhu, Ameya and Torr, Philip},
journal = {Transactions on Machine Learning Research},
year = {2024},
url = {https://mlanthology.org/tmlr/2024/gui2024tmlr-knnclip/}
}