Fast and Scalable Score-Based Kernel Calibration Tests
Abstract
We introduce the Kernel Calibration Conditional Stein Discrepancy test (KCCSD test), a nonparametric, kernel-based test for assessing the calibration of probabilistic models with well-defined scores. In contrast to previous methods, our test avoids the need for possibly expensive expectation approximations while providing control over its type-I error. We achieve these improvements by using a new family of kernels for score-based probabilities that can be estimated without probability density samples, and by using a conditional goodness-of-fit criterion for the KCCSD test's U-statistic. We demonstrate the properties of our test on various synthetic settings.
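To illustrate the score-based U-statistic machinery this family of tests builds on, the sketch below computes a generic kernel Stein discrepancy U-statistic from model scores and an RBF kernel, requiring only the score function rather than samples or normalizing constants. This is a minimal sketch of the shared ingredients, not the paper's KCCSD itself (which additionally conditions on the model's predicted distributions); the names `ksd_u_statistic`, `score_fn`, and `sigma` are illustrative assumptions.

```python
# Minimal sketch (assumed names, not the paper's exact KCCSD): a score-based
# kernel Stein discrepancy U-statistic with an RBF kernel. Only the model
# score s(x) = grad_x log p(x) is needed, no samples from p.
import numpy as np

def ksd_u_statistic(X, score_fn, sigma=1.0):
    """Unbiased U-statistic estimate of the kernel Stein discrepancy.

    X        : (n, d) array of samples from the distribution under test.
    score_fn : callable mapping (n, d) -> (n, d), the model score grad_x log p(x).
    sigma    : RBF kernel bandwidth.
    """
    n, d = X.shape
    S = score_fn(X)                               # scores at each sample, (n, d)
    diffs = X[:, None, :] - X[None, :, :]         # pairwise differences, (n, n, d)
    sqdists = np.sum(diffs ** 2, axis=-1)         # squared distances, (n, n)
    K = np.exp(-sqdists / (2 * sigma ** 2))       # RBF kernel matrix

    # Stein kernel h(x_i, x_j), assembled term by term.
    term1 = (S @ S.T) * K                                    # s(x)^T s(y) k(x, y)
    grad_y_K = diffs / sigma ** 2 * K[..., None]             # grad_y k(x, y)
    term2 = np.einsum('id,ijd->ij', S, grad_y_K)             # s(x)^T grad_y k(x, y)
    term3 = np.einsum('jd,ijd->ij', S, -grad_y_K)            # s(y)^T grad_x k(x, y)
    term4 = (d / sigma ** 2 - sqdists / sigma ** 4) * K      # trace(grad_x grad_y k)

    H = term1 + term2 + term3 + term4
    np.fill_diagonal(H, 0.0)                      # drop i == j terms (unbiased U-statistic)
    return H.sum() / (n * (n - 1))
```

In a calibration test of this kind, a statistic of this form is computed on held-out predictions and targets and compared against a null distribution (e.g. via a wild bootstrap) to control the type-I error; the sketch above only shows the score-plus-kernel U-statistic core.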
Cite
Text
Glaser et al. "Fast and Scalable Score-Based Kernel Calibration Tests." Uncertainty in Artificial Intelligence, 2023.
Markdown
[Glaser et al. "Fast and Scalable Score-Based Kernel Calibration Tests." Uncertainty in Artificial Intelligence, 2023.](https://mlanthology.org/uai/2023/glaser2023uai-fast/)
BibTeX
@inproceedings{glaser2023uai-fast,
title = {{Fast and Scalable Score-Based Kernel Calibration Tests}},
author = {Glaser, Pierre and Widmann, David and Lindsten, Fredrik and Gretton, Arthur},
booktitle = {Uncertainty in Artificial Intelligence},
year = {2023},
pages = {691--700},
volume = {216},
url = {https://mlanthology.org/uai/2023/glaser2023uai-fast/}
}