Empirically Validating Conformal Prediction on Modern Vision Architectures Under Distribution Shift and Long-Tailed Data
Abstract
Conformal prediction has emerged as a rigorous means of providing deep learning models with reliable uncertainty estimates and safety guarantees. Yet, its performance is known to degrade under distribution shift and long-tailed class distributions, which are often present in real-world applications. Here, we characterize the performance of several post-hoc and training-based conformal prediction methods under these settings, providing the first empirical evaluation on large-scale datasets and models. We show that across numerous conformal methods and neural network families, performance degrades significantly under distribution shift, violating the safety guarantees. Similarly, we show that in long-tailed settings the guarantees are frequently violated for many classes. Understanding the limitations of these methods is necessary for their deployment in real-world and safety-critical applications.
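For context, the coverage guarantee discussed in the abstract comes from the standard split conformal procedure. Below is a minimal sketch of that procedure, assuming softmax outputs from a pretrained classifier; the function and variable names (conformal_sets, calib_probs, etc.) are illustrative and not the authors' implementation.

import numpy as np

def conformal_sets(calib_probs, calib_labels, test_probs, alpha=0.1):
    """Split conformal prediction sets with ~(1 - alpha) marginal coverage."""
    n = len(calib_labels)
    # Nonconformity score: one minus the softmax mass on the true class.
    scores = 1.0 - calib_probs[np.arange(n), calib_labels]
    # Finite-sample-corrected quantile of the calibration scores.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    qhat = np.quantile(scores, q_level, method="higher")
    # A test point's set contains every class scoring below the threshold.
    return test_probs >= 1.0 - qhat  # boolean mask, shape (n_test, n_classes)

The (1 - alpha) coverage guarantee holds only when calibration and test data are exchangeable, which is precisely the assumption that distribution shift and per-class (long-tailed) evaluation break.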
Cite
Text
Kasa and Taylor. "Empirically Validating Conformal Prediction on Modern Vision Architectures Under Distribution Shift and Long-Tailed Data." ICML 2023 Workshops: SPIGM, 2023.
Markdown
[Kasa and Taylor. "Empirically Validating Conformal Prediction on Modern Vision Architectures Under Distribution Shift and Long-Tailed Data." ICML 2023 Workshops: SPIGM, 2023.](https://mlanthology.org/icmlw/2023/kasa2023icmlw-empirically/)
BibTeX
@inproceedings{kasa2023icmlw-empirically,
title = {{Empirically Validating Conformal Prediction on Modern Vision Architectures Under Distribution Shift and Long-Tailed Data}},
author = {Kasa, Kevin and Taylor, Graham W.},
booktitle = {ICML 2023 Workshops: SPIGM},
year = {2023},
url = {https://mlanthology.org/icmlw/2023/kasa2023icmlw-empirically/}
}