Conformal Bayesian Computation
Abstract
We develop scalable methods for producing conformal Bayesian predictive intervals with finite-sample calibration guarantees. Bayesian posterior predictive distributions, $p(y \mid x)$, characterize subjective beliefs about outcomes of interest, $y$, conditional on predictors, $x$. Bayesian prediction is well calibrated when the model is true, but the predictive intervals may exhibit poor empirical coverage when the model is misspecified, under the so-called ${\cal{M}}$-open perspective. In contrast, conformal inference provides finite-sample frequentist guarantees on predictive confidence intervals without requiring model fidelity. Using 'add-one-in' importance sampling, we show that conformal Bayesian predictive intervals can be obtained efficiently from re-weighted posterior samples of model parameters. Our approach contrasts with existing conformal methods, which require expensive refitting of models or data splitting to achieve computational efficiency. We demonstrate the utility of the approach on a range of examples, including extensions to partially exchangeable settings such as hierarchical models.
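As a rough illustration of the 'add-one-in' construction described in the abstract, the sketch below computes a full conformal predictive set from posterior samples in NumPy/SciPy. It assumes a Bayesian linear regression with Gaussian noise purely for concreteness; the function name `conformal_bayes_set`, the grid-based search over candidate responses, and all parameter names are illustrative, not the authors' released implementation.

```python
import numpy as np
from scipy.stats import norm

def conformal_bayes_set(y_grid, x_test, X, y, theta, sigma, alpha=0.1):
    """Sketch of conformal Bayes via 'add-one-in' importance sampling.

    theta: (T, d) posterior samples of regression coefficients drawn
           from p(theta | z_{1:n}); sigma: (T,) posterior noise scales.
    Assumption for this example: Gaussian likelihood y | x, theta.
    For each candidate y* on y_grid, the samples are re-weighted by the
    likelihood p(y* | x_test, theta_t), retargeting them at the posterior
    given the augmented data z_{1:n+1} without refitting the model.
    """
    n = len(y)
    # Per-sample predictive densities at the n observed points: (T, n)
    lik_train = norm.pdf(y[None, :], loc=theta @ X.T, scale=sigma[:, None])
    mu_test = theta @ x_test  # (T,) predictive means at x_test
    keep = []
    for y_star in y_grid:
        # Self-normalized importance weights w_t ∝ p(y* | x_test, theta_t)
        w = norm.pdf(y_star, loc=mu_test, scale=sigma)
        w /= w.sum()
        # Conformity scores: 'add-one-in' posterior predictive densities
        scores = w @ lik_train                                 # i = 1..n
        score_test = w @ norm.pdf(y_star, loc=mu_test, scale=sigma)
        # Conformal p-value: rank of the test score among all n+1 scores
        pi = (np.sum(scores <= score_test) + 1) / (n + 1)
        if pi > alpha:
            keep.append(y_star)
    return np.array(keep)
```

The posterior samples can come from any sampler (e.g., MCMC); the conformal set is then read off the grid at a cost of one weighted average per candidate value, which is the computational saving the abstract highlights over refit-based conformal methods.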
Cite

Text

Fong and Holmes. "Conformal Bayesian Computation." Neural Information Processing Systems, 2021.

Markdown

[Fong and Holmes. "Conformal Bayesian Computation." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/fong2021neurips-conformal/)

BibTeX
@inproceedings{fong2021neurips-conformal,
title = {{Conformal Bayesian Computation}},
author = {Fong, Edwin and Holmes, Chris C},
booktitle = {Neural Information Processing Systems},
year = {2021},
url = {https://mlanthology.org/neurips/2021/fong2021neurips-conformal/}
}