Sensitive Error Correcting Output Codes
Abstract
We present a reduction from cost-sensitive classification to binary classification based on (a modification of) error correcting output codes. The reduction satisfies the property that $\epsilon$ regret for binary classification implies $\ell_2$-regret of at most $2\epsilon$ for cost estimation. This has several implications: (1) Any regret-minimizing online algorithm for 0/1 loss is (via the reduction) a regret-minimizing online cost-sensitive algorithm. In particular, this means that online learning can be made to work for arbitrary (i.e., totally unstructured) loss functions. (2) The output of the reduction can be thresholded so that $\epsilon$ regret for binary classification implies at most $4\sqrt{\epsilon Z}$ regret for cost-sensitive classification, where $Z$ is the expected sum of costs. (3) For multiclass problems, $\epsilon$ binary regret translates into $\ell_2$-regret of at most $4\epsilon$ in the estimation of class probabilities. For classification, this implies at most $4\sqrt{\epsilon}$ multiclass regret.
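The reduction builds on the classical error correcting output code (ECOC) scheme, which SECOC modifies to carry cost information. As a point of reference, below is a minimal sketch of that underlying ECOC reduction from multiclass to binary classification, assuming scikit-learn's `LogisticRegression` as the base binary learner; the random code matrix and the `ECOCClassifier` name are illustrative assumptions, not the paper's construction (SECOC additionally encodes cost information so that binary regret bounds the $\ell_2$ regret of cost estimates).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


def make_code_matrix(n_classes, n_bits, rng):
    """Random binary codewords, one row per class (a hypothetical choice;
    the paper is careful about which codes give good guarantees)."""
    M = rng.integers(0, 2, size=(n_classes, n_bits))
    for j in range(n_bits):
        # Regenerate degenerate all-equal columns so every induced
        # binary problem actually has both labels.
        while M[:, j].min() == M[:, j].max():
            M[:, j] = rng.integers(0, 2, size=n_classes)
    return M


class ECOCClassifier:
    """Plain ECOC reduction: one binary learner per code bit,
    decoded by the nearest codeword in Hamming distance."""

    def __init__(self, n_classes, n_bits=15, seed=0):
        self.code = make_code_matrix(n_classes, n_bits,
                                     np.random.default_rng(seed))

    def fit(self, X, y):
        # Binary problem j: does the true class's codeword have bit j set?
        # (Assumes every class appears in the training set.)
        self.learners_ = []
        for j in range(self.code.shape[1]):
            clf = LogisticRegression(max_iter=1000)
            clf.fit(X, self.code[y, j])
            self.learners_.append(clf)
        return self

    def predict(self, X):
        # Collect the predicted bits and decode to the nearest codeword.
        bits = np.column_stack([clf.predict(X) for clf in self.learners_])
        dists = (bits[:, None, :] != self.code[None, :, :]).sum(axis=2)
        return dists.argmin(axis=1)


if __name__ == "__main__":
    from sklearn.datasets import make_classification

    X, y = make_classification(n_samples=600, n_features=10,
                               n_informative=6, n_classes=4, random_state=0)
    ecoc = ECOCClassifier(n_classes=4).fit(X[:500], y[:500])
    print("held-out accuracy:", (ecoc.predict(X[500:]) == y[500:]).mean())
```

Nearest-codeword decoding is what makes the scheme error correcting: a few wrong binary predictions can still leave the true class's codeword closest, and it is this kind of robustness that the regret transforms above quantify.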
Cite
Langford and Beygelzimer. "Sensitive Error Correcting Output Codes." Annual Conference on Computational Learning Theory, 2005. doi:10.1007/11503415_11
BibTeX
@inproceedings{langford2005colt-sensitive,
title = {{Sensitive Error Correcting Output Codes}},
author = {Langford, John and Beygelzimer, Alina},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2005},
pages = {158--172},
doi = {10.1007/11503415_11},
url = {https://mlanthology.org/colt/2005/langford2005colt-sensitive/}
}