Conformal Prediction Sets Can Cause Disparate Impact

Abstract

Conformal prediction is a statistically rigorous method for quantifying uncertainty in models by having them output sets of predictions, with larger sets indicating more uncertainty. However, prediction sets are not inherently actionable; many applications require a single output to act on, not several. To overcome this limitation, prediction sets can be provided to a human who then makes an informed decision. In any such system it is crucial to ensure the fairness of outcomes across protected groups, and researchers have proposed that Equalized Coverage be used as the standard for fairness. By conducting experiments with human participants, we demonstrate that providing prediction sets can lead to disparate impact in decisions. Disquietingly, we find that providing sets that satisfy Equalized Coverage actually increases disparate impact compared to marginal coverage. Instead of equalizing coverage, we propose to equalize set sizes across groups, which empirically leads to lower disparate impact.
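To make the abstract's premise concrete, the following is a minimal sketch of split conformal prediction for classification, the standard procedure by which a model's scores are converted into prediction sets with a coverage guarantee. All data, scores, and the nonconformity function here are synthetic and illustrative; they are not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-class problem: softmax-like probabilities for a held-out
# calibration split with known labels (synthetic, for illustration only).
n_cal, n_classes = 500, 3
cal_probs = rng.dirichlet(np.ones(n_classes), size=n_cal)
cal_labels = rng.integers(0, n_classes, size=n_cal)

alpha = 0.1  # target miscoverage: sets should contain the true label >= 90% of the time

# Nonconformity score: 1 minus the probability assigned to the true class.
cal_scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Conformal quantile with the finite-sample (n+1)/n correction.
q_level = min(np.ceil((n_cal + 1) * (1 - alpha)) / n_cal, 1.0)
qhat = np.quantile(cal_scores, q_level, method="higher")

# Prediction set for a new input: every class whose score falls below the threshold.
# Low-confidence inputs yield larger sets, signaling more uncertainty.
test_probs = rng.dirichlet(np.ones(n_classes))
prediction_set = np.where(1.0 - test_probs <= qhat)[0]
```

Calibrating a single threshold `qhat` on pooled data gives only marginal coverage; Equalized Coverage instead calibrates per protected group, which is the variant the paper finds can worsen disparate impact.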

Cite

Text

Cresswell et al. "Conformal Prediction Sets Can Cause Disparate Impact." International Conference on Learning Representations, 2025.

Markdown

[Cresswell et al. "Conformal Prediction Sets Can Cause Disparate Impact." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/cresswell2025iclr-conformal/)

BibTeX

@inproceedings{cresswell2025iclr-conformal,
  title     = {{Conformal Prediction Sets Can Cause Disparate Impact}},
  author    = {Cresswell, Jesse C. and Kumar, Bhargava and Sui, Yi and Belbahri, Mouloud},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/cresswell2025iclr-conformal/}
}