Error Limiting Reductions Between Classification Tasks
Abstract
We introduce a reduction-based model for analyzing supervised learning tasks. We use this model to devise a new reduction from multi-class cost-sensitive classification to binary classification with the following guarantee: if the learned binary classifier has error rate at most ε, then the cost-sensitive classifier has cost at most 2ε times the expected sum of costs of all possible labels. Since cost-sensitive classification can embed any bounded-loss finite-choice supervised learning task, this result shows that any such task can be solved using a binary classification oracle. Finally, we present experimental results showing that our new reduction outperforms existing algorithms for multi-class cost-sensitive learning.
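To illustrate the kind of reduction the abstract describes, the sketch below converts a single cost-sensitive multi-class example into importance-weighted binary examples, one per pair of labels. This is a hedged, hypothetical sketch of the general pairwise-reduction idea, not the paper's exact algorithm; the function name and weighting scheme are illustrative assumptions.

```python
# Hypothetical sketch: reduce one cost-sensitive multi-class example
# (input x, per-label cost vector) to importance-weighted binary examples.
# Each binary example asks "is label i better than label j?" and is
# weighted by how much the two labels' costs differ.
from itertools import combinations

def pairwise_binary_examples(x, costs):
    """Return a list of ((x, i, j), label, weight) tuples: label 1 means
    'prefer i over j', and weight is the cost difference |c_j - c_i|."""
    out = []
    for i, j in combinations(range(len(costs)), 2):
        weight = abs(costs[j] - costs[i])
        if weight > 0:  # pairs with equal cost carry no signal
            label = 1 if costs[i] < costs[j] else 0
            out.append(((x, i, j), label, weight))
    return out

# One example with three labels whose costs are 0.0, 0.5, and 1.0
examples = pairwise_binary_examples("x0", [0.0, 0.5, 1.0])
```

A binary oracle trained on such weighted examples can then be queried on all pairs at test time to pick a low-cost label, which is the flavor of oracle access the abstract refers to.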
Cite
Text
Beygelzimer et al. "Error Limiting Reductions Between Classification Tasks." International Conference on Machine Learning, 2005. doi:10.1145/1102351.1102358
Markdown
[Beygelzimer et al. "Error Limiting Reductions Between Classification Tasks." International Conference on Machine Learning, 2005.](https://mlanthology.org/icml/2005/beygelzimer2005icml-error/) doi:10.1145/1102351.1102358
BibTeX
@inproceedings{beygelzimer2005icml-error,
title = {{Error Limiting Reductions Between Classification Tasks}},
author = {Beygelzimer, Alina and Dani, Varsha and Hayes, Thomas P. and Langford, John and Zadrozny, Bianca},
booktitle = {International Conference on Machine Learning},
year = {2005},
pages = {49-56},
doi = {10.1145/1102351.1102358},
url = {https://mlanthology.org/icml/2005/beygelzimer2005icml-error/}
}