Decision-Driven Calibration for Cost-Sensitive Uncertainty Quantification

Abstract

In recent years, the ability of artificial intelligence (AI) systems to quantify their uncertainty has become paramount to building trustworthy AI. In standard uncertainty quantification (UQ), AI uncertainty is calibrated so that the confidence of the system's predictions matches the statistics of the underlying data distribution. However, this method of calibration does not account for the direct influence of UQ on the actions subsequently taken by downstream decision-makers. Here we demonstrate an alternative, decision-driven method of UQ calibration that explicitly minimizes the costs incurred by downstream decisions. After formulating decision-driven calibration as an optimization problem with respect to a known decision-maker, we show in a simulated search-and-rescue scenario how decision-driven temperature scaling can lead to lower incurred decision costs.
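
The abstract frames decision-driven calibration as choosing calibration parameters to minimize the cost of downstream decisions rather than a proper-scoring objective. As a rough illustrative sketch, not the authors' exact formulation, the Python below tunes a temperature T by grid search so that a known cost-sensitive decision-maker, which selects the action with minimum expected cost under the calibrated probabilities, incurs the lowest average cost on validation data. The cost matrix C and all function names here are hypothetical placeholders.

import numpy as np

def softmax(logits, T):
    # Temperature-scaled softmax; smaller T sharpens, larger T flattens.
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def incurred_cost(T, logits, labels, C):
    # C[a, y] is the (assumed) cost of taking action a when the true class is y.
    probs = softmax(logits, T)
    # Known decision-maker: pick the action minimizing expected cost
    # under the calibrated probabilities.
    actions = np.argmin(probs @ C.T, axis=1)
    # Average cost actually incurred against the true labels.
    return C[actions, labels].mean()

def fit_decision_driven_temperature(logits, labels, C, grid=np.logspace(-1, 1, 200)):
    # Grid search over T: the incurred cost is piecewise constant in T
    # (decisions flip at discrete thresholds), so gradient-free search is robust.
    costs = [incurred_cost(T, logits, labels, C) for T in grid]
    return float(grid[int(np.argmin(costs))])

For contrast, standard temperature scaling would select T to minimize negative log-likelihood on the same validation set; in this sketch the objective is the downstream decision cost itself, which is the distinction the abstract draws.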

Cite

Text

Canal et al. "Decision-Driven Calibration for Cost-Sensitive Uncertainty Quantification." NeurIPS 2024 Workshops: BDU, 2024.

Markdown

[Canal et al. "Decision-Driven Calibration for Cost-Sensitive Uncertainty Quantification." NeurIPS 2024 Workshops: BDU, 2024.](https://mlanthology.org/neuripsw/2024/canal2024neuripsw-decisiondriven/)

BibTeX

@inproceedings{canal2024neuripsw-decisiondriven,
  title     = {{Decision-Driven Calibration for Cost-Sensitive Uncertainty Quantification}},
  author    = {Canal, Gregory and Leung, Vladimir and Guerrerio, John J. and Sage, Philip and Wang, I-Jeng},
  booktitle = {NeurIPS 2024 Workshops: BDU},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/canal2024neuripsw-decisiondriven/}
}