Meta-Calibration: Learning of Model Calibration Using Differentiable Expected Calibration Error

Abstract

Calibration of neural networks is an increasingly important problem as neural networks underpin more and more real-world applications. The problem is especially noticeable in modern neural networks, where there is often a significant gap between the confidence of the model and the probability of correct prediction. Various strategies have been proposed to improve calibration, yet accurate calibration remains challenging. We propose a novel framework with two contributions: a new differentiable surrogate for expected calibration error (DECE) that allows calibration quality to be directly optimised, and a meta-learning framework that uses DECE to optimise validation-set calibration with respect to model hyper-parameters. The results show that we achieve competitive performance with existing calibration approaches. Our framework opens up a new avenue and toolset for tackling calibration, which we believe will inspire further work on this important challenge.
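To make the idea of a differentiable ECE surrogate concrete, here is a minimal NumPy sketch of one common construction: the hard bin assignment in standard ECE is replaced by a softmax over distances to bin centres, so the estimate becomes smooth in the predicted confidences. The function name `soft_ece` and the `temperature` parameter are illustrative assumptions, not the paper's exact DECE formulation; a real implementation would use an autograd framework so gradients can flow through this computation.

```python
import numpy as np


def soft_ece(confidences, correct, n_bins=10, temperature=100.0):
    """Soft-binned calibration error (illustrative sketch).

    Standard ECE assigns each prediction to one confidence bin and
    averages |accuracy - confidence| over bins, weighted by bin mass.
    Here the hard assignment is softened: each sample is distributed
    over bins via a softmax of negative squared distances to bin
    centres, controlled by `temperature` (higher = closer to hard ECE).
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)

    # Bin midpoints: 0.05, 0.15, ..., 0.95 for n_bins = 10.
    centres = (np.arange(n_bins) + 0.5) / n_bins

    # Soft assignment weights: softmax over -temperature * distance^2.
    logits = -temperature * (confidences[:, None] - centres[None, :]) ** 2
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)

    # Per-bin soft mass, average confidence, and average accuracy.
    mass = w.sum(axis=0)
    avg_conf = (w * confidences[:, None]).sum(axis=0) / np.maximum(mass, 1e-12)
    avg_acc = (w * correct[:, None]).sum(axis=0) / np.maximum(mass, 1e-12)

    # Mass-weighted mean absolute gap between accuracy and confidence.
    return float(np.sum(mass / len(confidences) * np.abs(avg_acc - avg_conf)))
```

Because every step is composed of smooth operations, this quantity can be minimised directly (e.g. as a validation objective in a meta-learning loop), unlike the piecewise-constant standard ECE.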

Cite

Text

Bohdal et al. "Meta-Calibration: Learning of Model Calibration Using Differentiable Expected Calibration Error." Transactions on Machine Learning Research, 2023.

Markdown

[Bohdal et al. "Meta-Calibration: Learning of Model Calibration Using Differentiable Expected Calibration Error." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/bohdal2023tmlr-metacalibration/)

BibTeX

@article{bohdal2023tmlr-metacalibration,
  title     = {{Meta-Calibration: Learning of Model Calibration Using Differentiable Expected Calibration Error}},
  author    = {Bohdal, Ondrej and Yang, Yongxin and Hospedales, Timothy},
  journal   = {Transactions on Machine Learning Research},
  year      = {2023},
  url       = {https://mlanthology.org/tmlr/2023/bohdal2023tmlr-metacalibration/}
}