Cautious Ordinal Classification by Binary Decomposition
Abstract
We study the problem of performing cautious inferences for an ordinal classification (a.k.a. ordinal regression) task, that is, when the possible classes are totally ordered. By cautious inference, we mean that we may produce partial predictions when the available information is insufficient to provide reliable precise ones. We do so by estimating probabilistic bounds instead of precise values. These bounds induce a (convex) set of possible probabilistic models, from which we perform inferences. As estimates and predictions for such models are usually computationally harder to obtain than for precise ones, we study the extension of two binary decomposition strategies that remain easy to obtain and computationally efficient to manipulate when shifting from precise to bounded estimates. We demonstrate the possible usefulness of such a cautious attitude through tests performed on benchmark data sets.
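The abstract's idea can be illustrated with a small sketch. In a Frank–Hall-style binary decomposition, an ordinal problem with classes 1..K is split into K-1 binary questions "is Y > k?". If each binary model returns an interval [low, high] on P(Y > k) rather than a point estimate, a threshold can only be decided when the whole interval lies on one side of 0.5; otherwise it stays undecided, and the prediction becomes a set of classes. The function name and the simple 0.5-threshold rule below are illustrative assumptions, not the paper's actual inference procedure (which reasons over the induced credal set):

```python
def cautious_predict(bounds, n_classes):
    """Toy cautious inference from a binary decomposition.

    bounds: list of (low, high) interval estimates of P(Y > k)
            for thresholds k = 1 .. n_classes - 1.
    Returns the interval of class labels not ruled out by any
    confidently decided threshold (a partial prediction).
    """
    lo, hi = 1, n_classes
    for k, (low, high) in enumerate(bounds, start=1):
        if low > 0.5:        # whole interval above 0.5: confidently Y > k
            lo = max(lo, k + 1)
        elif high < 0.5:     # whole interval below 0.5: confidently Y <= k
            hi = min(hi, k)
        # interval straddles 0.5: threshold k stays undecided
    return list(range(lo, hi + 1))

# Wide bounds on the middle threshold -> partial (set-valued) prediction
print(cautious_predict([(0.8, 0.9), (0.4, 0.6), (0.1, 0.2)], 4))  # [2, 3]
# Tight (precise) bounds -> the prediction collapses to a single class
print(cautious_predict([(0.8, 0.8), (0.6, 0.6), (0.2, 0.2)], 4))  # [3]
```

When the bounds are tight everywhere, the partial prediction reduces to the usual precise ordinal prediction, which is the behavior the abstract describes for the bounded-estimate extension.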
Cite
Text
Destercke and Yang. "Cautious Ordinal Classification by Binary Decomposition." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2014. doi:10.1007/978-3-662-44848-9_21
Markdown
[Destercke and Yang. "Cautious Ordinal Classification by Binary Decomposition." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2014.](https://mlanthology.org/ecmlpkdd/2014/destercke2014ecmlpkdd-cautious/) doi:10.1007/978-3-662-44848-9_21
BibTeX
@inproceedings{destercke2014ecmlpkdd-cautious,
title = {{Cautious Ordinal Classification by Binary Decomposition}},
author = {Destercke, Sébastien and Yang, Gen},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2014},
pages = {323--337},
doi = {10.1007/978-3-662-44848-9_21},
url = {https://mlanthology.org/ecmlpkdd/2014/destercke2014ecmlpkdd-cautious/}
}