Exploiting the Asymmetric Uncertainty Structure of Pre-Trained VLMs on the Unit Hypersphere
Abstract
Vision-language foundation models (VLMs) have significantly enhanced performance across a wide range of visual and textual tasks without requiring large-scale training from scratch for downstream tasks. However, these deterministic VLMs fail to capture the inherent ambiguity and uncertainty in natural language and visual data. Recent probabilistic post-hoc adaptation methods address this by mapping deterministic embeddings onto probability distributions; however, existing approaches account neither for the asymmetric uncertainty between modalities nor for the constraint that meaningful deterministic embeddings reside on the unit hypersphere, potentially leading to suboptimal performance. In this paper, we address the asymmetric uncertainty structure inherent in textual and visual data and propose AsymVLM to build probabilistic embeddings from pre-trained VLMs on the unit hypersphere, enabling uncertainty quantification. We validate the effectiveness of the probabilistic embeddings on established benchmarks and present comprehensive ablation studies demonstrating that the asymmetry in the uncertainty structure of textual and visual data is inherent.
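The abstract describes AsymVLM only at a high level, so the sketch below is a hedged illustration rather than the authors' implementation. It shows one common way to attach post-hoc probabilistic heads to frozen, L2-normalized VLM embeddings on the unit hypersphere: each modality gets its own small head that outputs a von Mises-Fisher-style mean direction and concentration, so text and image embeddings can express different (asymmetric) amounts of uncertainty. All names here (`SphericalHead`, `AsymmetricProbabilisticAdapter`, `kappa`, the hidden width) are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch (not the authors' code): post-hoc probabilistic heads on top of
# frozen VLM embeddings, using a vMF-style parameterization as one standard
# choice for modeling uncertainty on the unit hypersphere.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SphericalHead(nn.Module):
    """Maps a deterministic embedding to a mean direction and concentration."""

    def __init__(self, dim: int, hidden: int = 512):
        super().__init__()
        self.mu_net = nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))
        self.kappa_net = nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, 1))

    def forward(self, z: torch.Tensor):
        z = F.normalize(z, dim=-1)                      # embeddings live on the unit hypersphere
        mu = F.normalize(z + self.mu_net(z), dim=-1)    # perturbed mean direction, re-projected
        kappa = F.softplus(self.kappa_net(z)) + 1.0     # concentration > 0 (higher = less uncertain)
        return mu, kappa


class AsymmetricProbabilisticAdapter(nn.Module):
    """Separate heads per modality, so text and image can carry different uncertainty."""

    def __init__(self, dim: int = 512):
        super().__init__()
        self.text_head = SphericalHead(dim)
        self.image_head = SphericalHead(dim)

    def forward(self, text_emb: torch.Tensor, image_emb: torch.Tensor):
        mu_t, kappa_t = self.text_head(text_emb)
        mu_v, kappa_v = self.image_head(image_emb)
        return (mu_t, kappa_t), (mu_v, kappa_v)


if __name__ == "__main__":
    adapter = AsymmetricProbabilisticAdapter(dim=512)
    text_emb = torch.randn(4, 512)     # stand-ins for frozen CLIP-style text embeddings
    image_emb = torch.randn(4, 512)    # stand-ins for frozen CLIP-style image embeddings
    (mu_t, k_t), (mu_v, k_v) = adapter(text_emb, image_emb)
    print(k_t.squeeze(-1), k_v.squeeze(-1))  # per-sample concentrations differ by modality
```

In a post-hoc setting of this kind, the pre-trained VLM stays frozen and only the two lightweight heads are trained; a spherical likelihood or contrastive objective over the resulting distributions would supply the training signal. The paper's actual parameterization and objective may differ.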
Cite
Text
Ju et al. "Exploiting the Asymmetric Uncertainty Structure of Pre-Trained VLMs on the Unit Hypersphere." Advances in Neural Information Processing Systems, 2025.

Markdown
[Ju et al. "Exploiting the Asymmetric Uncertainty Structure of Pre-Trained VLMs on the Unit Hypersphere." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/ju2025neurips-exploiting/)

BibTeX
@inproceedings{ju2025neurips-exploiting,
  title     = {{Exploiting the Asymmetric Uncertainty Structure of Pre-Trained VLMs on the Unit Hypersphere}},
  author    = {Ju, Li and Andersson, Max and Fredriksson, Stina and Glöckner, Edward and Hellander, Andreas and Vats, Ekta and Singh, Prashant},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/ju2025neurips-exploiting/}
}