Cost-Sensitive Multiclass Classification Risk Bounds
Abstract
A commonly used approach to multiclass classification is to replace the 0-1 loss with a convex surrogate so as to make empirical risk minimization computationally tractable. Previous work has uncovered necessary and sufficient conditions for the consistency of the resulting procedures. In this paper, we strengthen these results by showing how the 0-1 excess loss of a predictor can be upper bounded as a function of the predictor's excess loss measured using the convex surrogate. The bound is developed for cost-sensitive multiclass classification and a convex surrogate loss that goes back to the work of Lee, Lin and Wahba. The bounds are as easy to calculate as in binary classification. Furthermore, we show that our analysis extends to the recently introduced "Simplex Coding" scheme.
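To make the surrogate concrete, here is a minimal sketch of a Lee-Lin-Wahba-style multiclass hinge, the standard form sum over wrong classes of max(0, f_j + 1) applied to sum-to-zero scores, together with a cost-weighted variant. This is an illustration under those assumptions, not the paper's exact cost-sensitive formulation, and the cost matrix `costs` is hypothetical.

```python
def llw_surrogate(scores, y):
    """Lee-Lin-Wahba-style multiclass hinge: sum_{j != y} max(0, f_j + 1).

    `scores` is a list of per-class scores (assumed to roughly satisfy
    the sum-to-zero constraint used in this family of losses);
    `y` is the index of the true class.
    """
    return sum(max(0.0, s + 1.0) for j, s in enumerate(scores) if j != y)


def cost_sensitive_llw(scores, y, costs):
    """Cost-weighted variant: weight each wrong-class hinge term by
    costs[y][j], a hypothetical misclassification-cost matrix
    (cost of predicting class j when the true class is y)."""
    return sum(costs[y][j] * max(0.0, scores[j] + 1.0)
               for j in range(len(scores)) if j != y)
```

With 0-1 costs (`costs[y][j] = 1` for `j != y`), the cost-weighted variant reduces to the plain surrogate; a score vector that puts the wrong classes at or below -1 incurs zero surrogate loss.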
Cite
Text
Ávila Pires et al. "Cost-Sensitive Multiclass Classification Risk Bounds." International Conference on Machine Learning, 2013.
Markdown
[Ávila Pires et al. "Cost-Sensitive Multiclass Classification Risk Bounds." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/avilapires2013icml-costsensitive/)
BibTeX
@inproceedings{avilapires2013icml-costsensitive,
title = {{Cost-Sensitive Multiclass Classification Risk Bounds}},
author = {Ávila Pires, Bernardo and Szepesvari, Csaba and Ghavamzadeh, Mohammad},
booktitle = {International Conference on Machine Learning},
year = {2013},
  pages = {1391--1399},
volume = {28},
url = {https://mlanthology.org/icml/2013/avilapires2013icml-costsensitive/}
}