Posterior Network: Uncertainty Estimation Without OOD Samples via Density-Based Pseudo-Counts
Abstract
Accurate estimation of aleatoric and epistemic uncertainty is crucial to build safe and reliable systems. Traditional approaches, such as dropout and ensemble methods, estimate uncertainty by sampling probability predictions from different submodels, which leads to slow uncertainty estimation at inference time. Recent works address this drawback by directly predicting parameters of prior distributions over the probability predictions with a neural network. While this approach has demonstrated accurate uncertainty estimation, it requires defining arbitrary target parameters for in-distribution data and makes the unrealistic assumption that out-of-distribution (OOD) data is known at training time.
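The "density-based pseudo-counts" in the title can be illustrated with a minimal sketch: a per-class density model evaluated at a (latent) input representation yields pseudo-counts that parameterize a Dirichlet posterior over the categorical prediction, so aleatoric and epistemic uncertainty come from a single forward pass instead of sampling many submodels. The sketch below is a simplified assumption-laden illustration, not the paper's implementation: it uses NumPy with isotropic Gaussian class densities in place of the paper's learned density estimators (normalizing flows), and the names `class_density`, `dirichlet_posterior`, and `uncertainties`, as well as the specific uncertainty summaries, are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: C classes, each with N_c observed training points and a
# class-conditional density on a low-dimensional latent space.
# (The paper fits these densities with normalizing flows; an isotropic
# Gaussian per class keeps this sketch self-contained.)
C, D = 3, 2
class_counts = np.array([500, 300, 200])       # N_c per class
class_means = rng.normal(size=(C, D)) * 3.0    # stand-in for learned densities

def class_density(z, mean, var=1.0):
    """Isotropic Gaussian density p(z | c) on the latent space."""
    d = z - mean
    return np.exp(-0.5 * d @ d / var) / ((2 * np.pi * var) ** (D / 2))

def dirichlet_posterior(z, prior=1.0):
    """Density-based pseudo-counts -> Dirichlet concentration parameters.

    beta_c  = N_c * p(z | c)   (large near class-c data, ~0 far away)
    alpha_c = prior + beta_c   (the flat prior dominates far from the data)
    """
    beta = np.array([class_counts[c] * class_density(z, class_means[c])
                     for c in range(C)])
    return prior + beta

def uncertainties(alpha):
    """Mean prediction plus simple aleatoric/epistemic summaries."""
    alpha0 = alpha.sum()
    p_mean = alpha / alpha0                      # expected categorical
    aleatoric = -(p_mean * np.log(p_mean)).sum() # entropy of the mean prediction
    epistemic = C / alpha0                       # little total evidence -> high epistemic
    return p_mean, aleatoric, epistemic

# In-distribution point (near class 0) vs. a far-away OOD point:
for name, z in [("in-distribution", class_means[0] + 0.1),
                ("out-of-distribution", np.full(D, 50.0))]:
    p, alea, epi = uncertainties(dirichlet_posterior(z))
    print(f"{name}: p={np.round(p, 3)}, aleatoric={alea:.3f}, epistemic={epi:.3f}")
```

Near the training data the class-conditional density is large, so the pseudo-counts dominate the flat prior and the posterior is confident; far from the data the density vanishes and the uniform, low-evidence prior takes over. This is why such a density-based construction can flag OOD inputs without requiring OOD samples at training time.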
Cite
Charpentier et al. "Posterior Network: Uncertainty Estimation Without OOD Samples via Density-Based Pseudo-Counts." Neural Information Processing Systems, 2020.
@inproceedings{charpentier2020neurips-posterior,
title = {{Posterior Network: Uncertainty Estimation Without OOD Samples via Density-Based Pseudo-Counts}},
author = {Charpentier, Bertrand and Zügner, Daniel and Günnemann, Stephan},
booktitle = {Neural Information Processing Systems},
year = {2020},
url = {https://mlanthology.org/neurips/2020/charpentier2020neurips-posterior/}
}