Approximating Full Conformal Prediction at Scale via Influence Functions
Abstract
Conformal prediction (CP) is a wrapper around traditional machine learning models that gives coverage guarantees under the sole assumption of exchangeability; in classification problems, CP guarantees that the error rate is at most a chosen significance level, irrespective of whether the underlying model is misspecified. However, the prohibitive computational cost of full CP has led researchers to design scalable alternatives, which alas do not attain the same guarantees or statistical power as full CP. In this paper, we use influence functions to efficiently approximate full CP. We prove that our method is a consistent approximation of full CP, and empirically show that the approximation error becomes smaller as the training set grows; e.g., for 1,000 training points the two methods output p-values that are
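To make concrete why full CP is expensive, the sketch below implements its classification procedure: for every candidate label, the test point is added to the training set with that label, all nonconformity scores are recomputed, and the p-value is the fraction of scores at least as large as the test point's. This is a minimal illustration, not the paper's method; the centroid-distance nonconformity score stands in for the retrained model, and all names are illustrative.

```python
import numpy as np

def nonconformity(X, y, i):
    """Toy nonconformity score for point i: distance to the centroid of
    its labeled class, computed on the full (augmented) dataset.
    In real full CP this would come from a model retrained on (X, y)."""
    centroid = X[y == y[i]].mean(axis=0)
    return np.linalg.norm(X[i] - centroid)

def full_cp_pvalues(X_train, y_train, x_test, labels):
    """Full conformal prediction: for each candidate label, augment the
    training set with (x_test, label), recompute every nonconformity
    score, and return the test point's p-value for that label."""
    pvals = {}
    for label in labels:
        X = np.vstack([X_train, x_test])
        y = np.append(y_train, label)
        scores = np.array([nonconformity(X, y, i) for i in range(len(y))])
        # p-value: fraction of scores at least as large as the test score
        pvals[label] = np.mean(scores >= scores[-1])
    return pvals

# Toy data: two well-separated clusters
rng = np.random.default_rng(0)
X_train = np.vstack([
    rng.normal(loc=[0, 0], scale=0.3, size=(20, 2)),
    rng.normal(loc=[3, 3], scale=0.3, size=(20, 2)),
])
y_train = np.array([0] * 20 + [1] * 20)

# Test point near class 0: its true label should get the larger p-value
pvals = full_cp_pvalues(X_train, y_train, np.array([0.1, -0.1]), [0, 1])
```

Note that each candidate label triggers a full pass over the augmented dataset (and, with a real model, a full retraining), which is the cost the paper's influence-function approximation avoids.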
Cite

Text
Martinez et al. "Approximating Full Conformal Prediction at Scale via Influence Functions." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I6.25814
BibTeX
@inproceedings{martinez2023aaai-approximating,
title = {{Approximating Full Conformal Prediction at Scale via Influence Functions}},
author = {Martinez, Javier Abad and Bhatt, Umang and Weller, Adrian and Cherubin, Giovanni},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
pages = {6631--6639},
doi = {10.1609/AAAI.V37I6.25814},
url = {https://mlanthology.org/aaai/2023/martinez2023aaai-approximating/}
}