Online Platt Scaling with Calibeating
Abstract
We present an online post-hoc calibration method, called Online Platt Scaling (OPS), which combines the Platt scaling technique with online logistic regression. We demonstrate that OPS smoothly adapts between i.i.d. and non-i.i.d. settings with distribution drift. Further, in scenarios where the best Platt scaling model is itself miscalibrated, we enhance OPS by incorporating a recently developed technique called calibeating to make it more robust. Theoretically, our resulting OPS+calibeating method is guaranteed to be calibrated for adversarial outcome sequences. Empirically, it is effective on a range of synthetic and real-world datasets, with and without distribution drifts, achieving superior performance without hyperparameter tuning. Finally, we extend all OPS ideas to the beta scaling method.
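Since the abstract builds on the classical Platt scaling technique, a minimal batch (non-online) sketch of Platt scaling may help orient the reader: fit parameters (a, b) so that sigmoid(a·z + b) minimizes log loss on held-out logits. The function names and the gradient-descent fitting procedure below are illustrative assumptions, not the paper's OPS algorithm, which instead updates the parameters with online logistic regression.

```python
import numpy as np

def platt_scale_fit(logits, labels, lr=0.1, steps=2000):
    """Fit Platt scaling parameters (a, b) by minimizing the log loss
    of sigmoid(a * z + b) with plain batch gradient descent.
    (Illustrative sketch; the paper's OPS method is an *online* variant.)"""
    a, b = 1.0, 0.0
    z = np.asarray(logits, dtype=float)
    y = np.asarray(labels, dtype=float)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(a * z + b)))
        grad = p - y                  # d(log loss)/d(a*z + b)
        a -= lr * np.mean(grad * z)
        b -= lr * np.mean(grad)
    return a, b

def platt_scale_predict(logits, a, b):
    """Map raw logits to calibrated probabilities using fitted (a, b)."""
    z = np.asarray(logits, dtype=float)
    return 1.0 / (1.0 + np.exp(-(a * z + b)))

# Synthetic example: noisy logits whose scale does not match the labels.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=500)
z = 3.0 * (2 * y - 1) + rng.normal(0.0, 2.0, size=500)
a, b = platt_scale_fit(z, y)
p = platt_scale_predict(z, a, b)
```

At the optimum of the log loss, the average recalibrated probability matches the empirical base rate, which is one simple sanity check that the fit behaved as expected.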
Cite
Text
Gupta and Ramdas. "Online Platt Scaling with Calibeating." International Conference on Machine Learning, 2023.
Markdown
[Gupta and Ramdas. "Online Platt Scaling with Calibeating." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/gupta2023icml-online/)
BibTeX
@inproceedings{gupta2023icml-online,
title = {{Online Platt Scaling with Calibeating}},
author = {Gupta, Chirag and Ramdas, Aaditya},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {12182--12204},
volume = {202},
url = {https://mlanthology.org/icml/2023/gupta2023icml-online/}
}