Well-Calibrated Predictions from Online Compression Models
Abstract
It has recently been shown that the Transductive Confidence Machine (TCM) is automatically well-calibrated when used in the on-line mode, provided that the data sequence is generated by an exchangeable distribution. In this paper we strengthen this result by relaxing the exchangeability assumption on the data-generating distribution to the much weaker assumption that the data agree with a given “on-line compression model”.
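The calibration property the abstract refers to can be illustrated numerically: smoothed conformal (TCM) p-values computed on-line from exchangeable data yield errors at significance level ε with frequency close to ε. The sketch below is illustrative only, not the paper's construction; the data distribution (i.i.d. Gaussian, hence exchangeable), the nonconformity measure (distance to the mean of the other examples), and all names are our own choices.

```python
import random

random.seed(0)
EPS = 0.05   # significance level
N = 2000     # length of the i.i.d. (hence exchangeable) data sequence

data = [random.gauss(0.0, 1.0) for _ in range(N)]

errors = 0
for n in range(2, N + 1):
    zs = data[:n]
    s = sum(zs)
    # Nonconformity score of each example: distance to the mean of the
    # other examples (any bag-invariant measure preserves validity).
    alphas = [abs(z - (s - z) / (n - 1)) for z in zs]
    a_new = alphas[-1]
    # Smoothed p-value of the newest example, ties broken uniformly.
    gt = sum(1 for a in alphas if a > a_new)
    eq = sum(1 for a in alphas if a == a_new)
    p = (gt + random.random() * eq) / n
    if p <= EPS:   # the newest example is flagged as nonconforming
        errors += 1

error_rate = errors / (N - 1)
print(f"empirical error rate: {error_rate:.3f} (significance level {EPS})")
```

Under exchangeability the smoothed p-values are independent and uniform on [0, 1], so the empirical error rate concentrates around EPS; the paper extends this kind of validity guarantee from exchangeability to general on-line compression models.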
Cite

Text

Vovk. "Well-Calibrated Predictions from Online Compression Models." International Conference on Algorithmic Learning Theory, 2003. doi:10.1007/978-3-540-39624-6_22

Markdown

[Vovk. "Well-Calibrated Predictions from Online Compression Models." International Conference on Algorithmic Learning Theory, 2003.](https://mlanthology.org/alt/2003/vovk2003alt-wellcalibrated/) doi:10.1007/978-3-540-39624-6_22

BibTeX
@inproceedings{vovk2003alt-wellcalibrated,
title = {{Well-Calibrated Predictions from Online Compression Models}},
author = {Vovk, Vladimir},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {2003},
pages = {268-282},
doi = {10.1007/978-3-540-39624-6_22},
url = {https://mlanthology.org/alt/2003/vovk2003alt-wellcalibrated/}
}