CoCoA-Mix: Confusion-and-Confidence-Aware Mixture Model for Context Optimization

Abstract

Prompt tuning, which adapts vision-language models by freezing the model parameters and optimizing only the prompt, has proven effective for task-specific adaptation. The core challenge in prompt tuning is improving specialization to a specific task while preserving generalization to unseen domains. However, frozen encoders often produce misaligned features, leading to confusion between classes and limiting specialization. To overcome this issue, we propose a confusion-aware loss (CoA-loss) that improves specialization by refining the decision boundaries between confusing classes. Additionally, we mathematically demonstrate that a mixture model can enhance generalization without compromising specialization. This is achieved using confidence-aware weights (CoA-weights), which adjust the weight of each prediction in the mixture model based on its confidence within the class domains. Extensive experiments show that CoCoA-Mix, a mixture model with CoA-loss and CoA-weights, outperforms state-of-the-art methods by enhancing both specialization and generalization. Our code is publicly available at https://github.com/url-kaist/CoCoA-Mix.
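As a rough illustration of the confidence-aware mixture described in the abstract, the sketch below combines predictions from K independently tuned prompts using per-class weights. This is not the authors' implementation; the function name `mixture_predict`, the tensor shapes, and the softmax normalization of the weights are all assumptions made for illustration.

```python
import torch
import torch.nn.functional as F

def mixture_predict(logits_list: list[torch.Tensor],
                    coa_weights: torch.Tensor) -> torch.Tensor:
    """Combine per-prompt predictions with per-class confidence weights.

    logits_list: K tensors of shape (batch, num_classes), one per prompt
                 (hypothetical setup: K prompts tuned on the same task).
    coa_weights: (K, num_classes) tensor of learnable scores; here we
                 normalize them per class across the K prompts (assumed).
    Returns:     (batch, num_classes) mixture probabilities.
    """
    # Per-prompt class probabilities, stacked to (K, batch, num_classes).
    probs = torch.stack([F.softmax(l, dim=-1) for l in logits_list])
    # Normalize weights across prompts for each class: (K, 1, num_classes).
    w = F.softmax(coa_weights, dim=0).unsqueeze(1)
    # Confidence-weighted average over the K prompts: (batch, num_classes).
    return (w * probs).sum(dim=0)
```

Under this reading of the abstract, normalizing the weights per class lets each class's final prediction lean on whichever prompt is most confident within that class's domain, which is how a mixture could gain generalization without diluting the specialized prompts.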

Cite

Text

Hong et al. "CoCoA-Mix: Confusion-and-Confidence-Aware Mixture Model for Context Optimization." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Hong et al. "CoCoA-Mix: Confusion-and-Confidence-Aware Mixture Model for Context Optimization." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/hong2025icml-cocoamix/)

BibTeX

@inproceedings{hong2025icml-cocoamix,
  title     = {{CoCoA-Mix: Confusion-and-Confidence-Aware Mixture Model for Context Optimization}},
  author    = {Hong, Dasol and Lee, Wooju and Myung, Hyun},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {23700--23721},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/hong2025icml-cocoamix/}
}