Making Good Probability Estimates for Regression
Abstract
In this paper, we show that the optimisation of density forecasting models for regression in machine learning can be formulated as a multi-objective problem. We describe the two objectives of sharpness and calibration and suggest suitable scoring metrics for both. We use the popular negative log-likelihood as a measure of sharpness and the probability integral transform as a measure of calibration. To optimise density forecasting models under multiple criteria we introduce a multi-objective evolutionary optimisation framework that can produce better density forecasts from a prediction user’s perspective. Our experiments show improvements over the state-of-the-art on a risk management problem.
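The two scoring criteria named in the abstract can be sketched concretely. The snippet below (an illustrative sketch, not the paper's implementation) assumes Gaussian density forecasts and synthetic data: sharpness is measured by the average negative log-likelihood, and calibration by the probability integral transform (PIT), whose values should be approximately Uniform(0, 1) for a well-calibrated forecaster.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical setup: a perfectly specified forecaster that predicts
# N(mu_i, sigma_i^2) for targets actually drawn from that distribution.
n = 1000
mu = rng.normal(size=n)
sigma = np.full(n, 1.0)
y = rng.normal(mu, sigma)

# Sharpness: mean negative log-likelihood of the observations under
# the predictive densities (lower is sharper, given calibration).
nll = -norm.logpdf(y, loc=mu, scale=sigma).mean()

# Calibration: PIT values F_i(y_i). For a calibrated forecaster these
# are approximately uniform on [0, 1], e.g. their mean is near 0.5.
pit = norm.cdf(y, loc=mu, scale=sigma)

print(nll)
print(pit.mean())
```

In a multi-objective setting such as the one the paper proposes, these two quantities would be scored separately rather than collapsed into a single loss.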
Cite
Text
Carney and Cunningham. "Making Good Probability Estimates for Regression." European Conference on Machine Learning, 2006. doi:10.1007/11871842_55
Markdown
[Carney and Cunningham. "Making Good Probability Estimates for Regression." European Conference on Machine Learning, 2006.](https://mlanthology.org/ecmlpkdd/2006/carney2006ecml-making/) doi:10.1007/11871842_55
BibTeX
@inproceedings{carney2006ecml-making,
title = {{Making Good Probability Estimates for Regression}},
author = {Carney, Michael and Cunningham, Padraig},
booktitle = {European Conference on Machine Learning},
year = {2006},
pages = {582--589},
doi = {10.1007/11871842_55},
url = {https://mlanthology.org/ecmlpkdd/2006/carney2006ecml-making/}
}