Retrospective Uncertainties for Deep Models Using Vine Copulas
Abstract
Despite the major progress of deep models as learning machines, uncertainty estimation remains a significant challenge. Existing solutions rely on modified loss functions or architectural changes. We propose to compensate for the lack of built-in uncertainty estimates by supplementing any network, retrospectively, with a vine copula model, in an overall compound we call the Vine-Copula Neural Network (VCNN). Through synthetic and real-data experiments, we show that VCNNs are task- (regression/classification) and architecture- (recurrent, fully connected) agnostic while providing reliable and better-calibrated uncertainty estimates, comparable to state-of-the-art built-in uncertainty solutions.
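To make the retrospective idea concrete, here is a minimal sketch, not taken from the paper: a frozen toy regressor is supplemented after training with a copula fitted on held-out (prediction, target) pairs, from which conditional predictive intervals are read off. A bivariate Gaussian copula stands in for the vine copula used by VCNN, and the model, variable choices, and function names are illustrative assumptions only.

```python
# Hedged sketch of post hoc (retrospective) uncertainty via a copula.
# A Gaussian copula replaces the paper's vine copula for simplicity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# --- 1. Any pre-trained model; a toy polynomial fit plays that role here. ---
x = rng.uniform(-3, 3, size=2000)
y = np.sin(x) + 0.3 * (1 + np.abs(x)) * rng.normal(size=x.size)  # heteroscedastic noise
coef = np.polyfit(x, y, deg=5)

def predict(x_new):
    return np.polyval(coef, x_new)

# --- 2. Retrospective step: fit a copula on (prediction, target) pseudo-obs. ---
# The network/regressor itself is left untouched.
y_hat = predict(x)
u = stats.rankdata(y_hat) / (len(y_hat) + 1)   # pseudo-observations in (0, 1)
v = stats.rankdata(y) / (len(y) + 1)
z_u, z_v = stats.norm.ppf(u), stats.norm.ppf(v)
rho = np.corrcoef(z_u, z_v)[0, 1]              # Gaussian-copula dependence parameter
y_hat_sorted = np.sort(y_hat)

# --- 3. Uncertainty for a new input: conditional quantiles of y given y_hat. ---
def predictive_interval(x_new, alpha=0.1):
    y_hat_new = predict(x_new)
    u_new = np.clip(np.searchsorted(y_hat_sorted, y_hat_new) / (len(y_hat) + 1),
                    1e-6, 1 - 1e-6)
    z_new = stats.norm.ppf(u_new)
    # For a Gaussian copula, Z_v | Z_u = z is normal with mean rho*z, variance 1 - rho^2.
    lo_v = stats.norm.cdf(rho * z_new + np.sqrt(1 - rho**2) * stats.norm.ppf(alpha / 2))
    hi_v = stats.norm.cdf(rho * z_new + np.sqrt(1 - rho**2) * stats.norm.ppf(1 - alpha / 2))
    # Map copula-scale quantiles back through the empirical marginal of y.
    return np.quantile(y, lo_v), np.quantile(y, hi_v)

print(predictive_interval(1.5))  # e.g. a 90% interval around the point prediction
```

In the paper, the same principle is applied with vine copulas over network quantities, which is what makes the construction agnostic to the task and to the underlying architecture.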
Cite
Text
Tagasovska et al. "Retrospective Uncertainties for Deep Models Using Vine Copulas." Artificial Intelligence and Statistics, 2023.
Markdown
[Tagasovska et al. "Retrospective Uncertainties for Deep Models Using Vine Copulas." Artificial Intelligence and Statistics, 2023.](https://mlanthology.org/aistats/2023/tagasovska2023aistats-retrospective/)
BibTeX
@inproceedings{tagasovska2023aistats-retrospective,
title = {{Retrospective Uncertainties for Deep Models Using Vine Copulas}},
author = {Tagasovska, Natasa and Ozdemir, Firat and Brando, Axel},
booktitle = {Artificial Intelligence and Statistics},
year = {2023},
pages = {7528--7539},
volume = {206},
url = {https://mlanthology.org/aistats/2023/tagasovska2023aistats-retrospective/}
}