Two Fundamental Limits for Uncertainty Quantification in Predictive Inference
Abstract
We study the statistical hardness of estimating two basic representations of uncertainty in predictive inference: prediction sets and calibration error. First, we show that conformal prediction sets cannot approach a desired weighted conformal coverage level—with respect to a family of binary witness functions with VC dimension $d$—at a minimax rate faster than $O(d^{1/2}n^{-1/2})$. We also show that the algorithm in Gibbs et al. (2023) achieves this rate and that extending our class of conformal sets beyond thresholds of non-conformity scores to include arbitrary convex sets of non-conformity scores only improves the minimax rate by a constant factor. Then, under a similar VC dimension constraint on the witness function class, we show it is not possible to estimate the weighted weak calibration error at a minimax rate faster than $O(d^{1/4}n^{-1/2})$. We show that the algorithm in Kumar et al. (2019) achieves this rate in the particular case of estimating the squared weak calibration error of a predictor that outputs $d$ distinct values.
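The prediction sets studied here are built by thresholding non-conformity scores. A minimal sketch of the standard split-conformal construction (the toy data, predictor, and score function below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = x + Gaussian noise; predictor f(x) = x.
n = 1000
x = rng.uniform(0, 1, size=n)
y = x + 0.1 * rng.normal(size=n)

# Non-conformity scores on the calibration split: absolute residuals.
scores = np.abs(y - x)

# Split-conformal threshold for marginal coverage 1 - alpha:
# the ceil((n + 1)(1 - alpha))-th order statistic of the scores.
alpha = 0.1
k = int(np.ceil((n + 1) * (1 - alpha)))
threshold = np.sort(scores)[k - 1]

# Prediction set at a new point x_new: {y : |y - f(x_new)| <= threshold}.
x_new = 0.5
interval = (x_new - threshold, x_new + threshold)

# Empirical coverage on fresh exchangeable data concentrates near 1 - alpha.
x_test = rng.uniform(0, 1, size=5000)
y_test = x_test + 0.1 * rng.normal(size=5000)
coverage = np.mean(np.abs(y_test - x_test) <= threshold)
```

This sketch gives only marginal coverage; the paper's lower bound concerns the harder problem of weighted coverage uniformly over a witness class of VC dimension $d$, as in Gibbs et al. (2023).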
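For the second limit, a natural estimator of the squared calibration error of a predictor with $d$ distinct output values is the binned plug-in estimator below. This is only an illustrative sketch with assumed toy data; the algorithm analyzed in the paper (from Kumar et al., 2019) uses a debiased refinement of this quantity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: a predictor emitting one of d distinct confidence values,
# with an assumed miscalibration of +0.05 in each bin.
d = 10
pred_values = np.linspace(0.05, 0.95, d)         # the d distinct outputs
true_probs = np.clip(pred_values + 0.05, 0, 1)   # assumed true P(Y=1 | f(X)=v)

n = 20000
idx = rng.integers(0, d, size=n)                 # which value the predictor emits
preds = pred_values[idx]
labels = rng.binomial(1, true_probs[idx])

# Plug-in estimate of the squared (L2) calibration error:
# sum over v of Pr-hat(f(X)=v) * (v - E-hat[Y | f(X)=v])^2.
sq_ce = 0.0
for v in pred_values:
    mask = preds == v
    if mask.any():
        sq_ce += mask.mean() * (v - labels[mask].mean()) ** 2
```

With the assumed +0.05 offset, the population squared calibration error is $0.05^2 = 0.0025$; the plug-in estimate carries an upward bias of order $d/n$, which is what debiasing corrects.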
Cite
Text
Areces et al. "Two Fundamental Limits for Uncertainty Quantification in Predictive Inference." Conference on Learning Theory, 2024.
Markdown
[Areces et al. "Two Fundamental Limits for Uncertainty Quantification in Predictive Inference." Conference on Learning Theory, 2024.](https://mlanthology.org/colt/2024/areces2024colt-two/)
BibTeX
@inproceedings{areces2024colt-two,
title = {{Two Fundamental Limits for Uncertainty Quantification in Predictive Inference}},
author = {Areces, Felipe and Cheng, Chen and Duchi, John and Kuditipudi, Rohith},
booktitle = {Conference on Learning Theory},
year = {2024},
pages = {186--218},
volume = {247},
url = {https://mlanthology.org/colt/2024/areces2024colt-two/}
}