Variational Uncertainty Decomposition for In-Context Learning
Abstract
As large language models (LLMs) gain popularity in conducting prediction tasks in-context, understanding the sources of uncertainty in in-context learning becomes essential for ensuring reliability. The recent hypothesis that in-context learning performs predictive Bayesian inference opens an avenue for Bayesian uncertainty estimation, particularly for decomposing uncertainty into epistemic uncertainty due to lack of in-context data and aleatoric uncertainty inherent in the in-context prediction task. However, the decomposition idea remains under-explored due to the intractability of the latent parameter posterior from the underlying Bayesian model. In this work, we introduce a variational uncertainty decomposition framework for in-context learning that avoids explicitly sampling from the latent parameter posterior, by optimising auxiliary inputs as probes to obtain an upper bound on the aleatoric uncertainty of an LLM's in-context learning procedure. Through experiments on synthetic and real-world tasks, we show quantitatively and qualitatively that the decomposed uncertainties obtained from our method exhibit desirable properties of epistemic and aleatoric uncertainty.
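As background to the decomposition the abstract describes, a standard Bayesian formulation splits total predictive uncertainty into an aleatoric term (expected entropy under the latent parameter posterior) and an epistemic term (the mutual information between the prediction and the latent parameter). The sketch below illustrates this classical entropy-based decomposition on a toy ensemble of posterior samples; it is illustrative background only, not the paper's variational method, and the function names and array shapes are our own assumptions.

```python
import numpy as np

def entropy(p, axis=-1):
    """Shannon entropy (nats) of categorical distributions along `axis`."""
    p = np.clip(p, 1e-12, 1.0)
    return -np.sum(p * np.log(p), axis=axis)

def decompose_uncertainty(member_probs):
    """Toy entropy-based uncertainty decomposition (illustrative only).

    member_probs: array of shape (M, K) -- M posterior samples of the latent
    parameter, each giving a categorical predictive distribution over K classes.

    Returns (total, aleatoric, epistemic), where
      total     = H[ E_theta p(y|x,theta) ]  (entropy of the mean prediction)
      aleatoric = E_theta H[ p(y|x,theta) ]  (expected entropy)
      epistemic = total - aleatoric          (mutual information I(y; theta))
    """
    mean_pred = member_probs.mean(axis=0)
    total = entropy(mean_pred)
    aleatoric = entropy(member_probs).mean()
    return total, aleatoric, total - aleatoric
```

When posterior samples agree, the epistemic term vanishes and all uncertainty is aleatoric; when individually confident samples disagree, the epistemic term dominates. The paper's contribution is to bound the aleatoric term without sampling the latent parameter posterior, which this toy snippet does not attempt.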
Cite
Text
Jayasekera et al. "Variational Uncertainty Decomposition for In-Context Learning." Advances in Neural Information Processing Systems, 2025.
Markdown
[Jayasekera et al. "Variational Uncertainty Decomposition for In-Context Learning." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/jayasekera2025neurips-variational/)
BibTeX
@inproceedings{jayasekera2025neurips-variational,
title = {{Variational Uncertainty Decomposition for In-Context Learning}},
author = {Jayasekera, I. Shavindra and Si, Jacob and Valdettaro, Filippo and Chen, Wenlong and Faisal, Aldo A. and Li, Yingzhen},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/jayasekera2025neurips-variational/}
}