DeCoOp: Robust Prompt Tuning with Out-of-Distribution Detection

Abstract

Vision-language models (VLMs), such as CLIP, have demonstrated impressive zero-shot capabilities for various downstream tasks. Their performance can be further enhanced through few-shot prompt tuning methods. However, current studies evaluate the performance of learned prompts separately on base and new classes. This evaluation lacks practicality for real-world applications, since downstream tasks cannot determine in advance whether a sample belongs to a base or a new class. In this paper, we explore a problem setting called Open-world Prompt Tuning (OPT), which involves tuning prompts on base classes and evaluating on a combination of base and new classes. By introducing the Decomposed Prompt Tuning framework (DePT), we theoretically demonstrate that OPT can be solved by incorporating out-of-distribution detection into prompt tuning, thereby enhancing the base-to-new discriminability. Based on DePT, we present a novel prompt tuning approach, namely, Decomposed Context Optimization (DeCoOp), which introduces new-class detectors and sub-classifiers to further enhance the base-class and new-class discriminability. Experimental results on 11 benchmark datasets validate the effectiveness of DePT and demonstrate that DeCoOp outperforms current state-of-the-art methods, providing a significant 2% average accuracy improvement.
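The abstract's core mechanism — a new-class (OOD) detector that routes each test sample either to a tuned base-class sub-classifier or to a zero-shot classifier over new-class prompts — can be illustrated with a minimal sketch. The function name, the max-similarity thresholding rule, and the single-detector setup below are illustrative assumptions for exposition, not the paper's exact formulation (DeCoOp uses multiple learned detectors and sub-classifiers):

```python
import numpy as np

def decoop_style_predict(image_emb, base_text_embs, new_text_embs, threshold=0.5):
    """Illustrative sketch (not the paper's exact method): an OOD detector
    decides whether an image looks like a base class; if not, prediction
    falls back to a zero-shot classifier over new-class prompt embeddings."""
    def cos_sim(a, B):
        # Cosine similarity between one image embedding and each row of B.
        a = a / np.linalg.norm(a)
        B = B / np.linalg.norm(B, axis=1, keepdims=True)
        return B @ a

    base_scores = cos_sim(image_emb, base_text_embs)
    # New-class detection: a low maximum base-class similarity suggests
    # the sample is out-of-distribution with respect to the base classes.
    if base_scores.max() < threshold:
        new_scores = cos_sim(image_emb, new_text_embs)
        return "new", int(np.argmax(new_scores))
    return "base", int(np.argmax(base_scores))
```

The point of the decomposition is that the tuned base prompts are only ever asked to discriminate among base classes, while new classes are handled by a separate path, which is what preserves accuracy on the combined base-plus-new evaluation.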

Cite

Text

Zhou et al. "DeCoOp: Robust Prompt Tuning with Out-of-Distribution Detection." International Conference on Machine Learning, 2024.

Markdown

[Zhou et al. "DeCoOp: Robust Prompt Tuning with Out-of-Distribution Detection." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/zhou2024icml-decoop/)

BibTeX

@inproceedings{zhou2024icml-decoop,
  title     = {{DeCoOp: Robust Prompt Tuning with Out-of-Distribution Detection}},
  author    = {Zhou, Zhi and Yang, Ming and Shi, Jiang-Xin and Guo, Lan-Zhe and Li, Yu-Feng},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {62161--62177},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/zhou2024icml-decoop/}
}