Probabilistic Score Estimation with Piecewise Logistic Regression
Abstract
Well-calibrated probabilities are necessary in many applications, such as probabilistic frameworks and cost-sensitive tasks. Building on the previous success of the asymmetric Laplace method in calibrating text classifiers' scores, we propose piecewise logistic regression, a simple extension of standard logistic regression, as an alternative method in the discriminative family. We show that both methods have the flexibility to represent piecewise linear functions in log-odds, but they rest on quite different assumptions. We evaluated the asymmetric Laplace method, piecewise logistic regression, and standard logistic regression on standard text categorization collections (Reuters-21578 and TREC-AP) with three classifiers (SVM, naive Bayes, and a logistic regression classifier), and observed that piecewise logistic regression performs significantly better than the other two methods under the log-loss metric.
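The calibration idea the abstract describes can be illustrated with a small sketch: map a classifier's raw score through a function that is piecewise linear in log-odds, then pass it through a sigmoid. One standard way to get a continuous piecewise linear map is to fit ordinary logistic regression on a hinge-feature basis of the score. The knot placement, basis, and optimizer below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def hinge_basis(s, knots):
    """Expand raw scores s into [1, s, max(0, s-k1), ..., max(0, s-kK)].
    A linear combination of these features is continuous and piecewise
    linear in s, with slope changes at the knots."""
    cols = [np.ones_like(s), s] + [np.maximum(0.0, s - k) for k in knots]
    return np.stack(cols, axis=1)

def fit_piecewise_lr(s, y, knots, lr=0.1, steps=2000):
    """Fit logistic regression on the hinge basis by batch gradient
    descent, yielding a calibration map piecewise linear in log-odds."""
    X = hinge_basis(s, knots)
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def calibrate(s, w, knots):
    """Turn raw scores into calibrated probability estimates."""
    return 1.0 / (1.0 + np.exp(-hinge_basis(s, knots) @ w))

# Synthetic classifier scores: positives tend to score higher.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=500).astype(float)
s = y * 1.5 + rng.normal(0.0, 1.0, size=500)

knots = np.quantile(s, [0.25, 0.5, 0.75])  # knot placement is a free choice
w = fit_piecewise_lr(s, y, knots)
p = calibrate(s, w, knots)

eps = 1e-12
log_loss = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
```

With knots at score quantiles, the model reduces to standard logistic regression when all hinge weights are zero, which makes the two methods directly comparable under the same log-loss metric used in the paper's evaluation.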
Cite
Text
Zhang and Yang. "Probabilistic Score Estimation with Piecewise Logistic Regression." International Conference on Machine Learning, 2004. doi:10.1145/1015330.1015335
Markdown
[Zhang and Yang. "Probabilistic Score Estimation with Piecewise Logistic Regression." International Conference on Machine Learning, 2004.](https://mlanthology.org/icml/2004/zhang2004icml-probabilistic/) doi:10.1145/1015330.1015335
BibTeX
@inproceedings{zhang2004icml-probabilistic,
title = {{Probabilistic Score Estimation with Piecewise Logistic Regression}},
author = {Zhang, Jian and Yang, Yiming},
booktitle = {International Conference on Machine Learning},
year = {2004},
doi = {10.1145/1015330.1015335},
url = {https://mlanthology.org/icml/2004/zhang2004icml-probabilistic/}
}