One-Sided Support Vector Regression for Multiclass Cost-Sensitive Classification
Abstract
We propose a novel approach that reduces cost-sensitive classification to one-sided regression. The approach stores the cost information in the regression labels and encodes the minimum-cost prediction with the one-sided loss. This simple approach is accompanied by a solid theoretical guarantee of error transformation, and can be used to cast any one-sided regression method as a cost-sensitive classification algorithm. To validate the proposed reduction approach, we design a new cost-sensitive classification algorithm by coupling the approach with a variant of the support vector machine (SVM) for one-sided regression. The proposed algorithm can be viewed as a theoretically justified extension of the popular one-versus-all SVM. Experimental results demonstrate that the algorithm is not only superior to the traditional one-versus-all SVM for cost-sensitive classification, but also better than many existing SVM-based cost-sensitive classification algorithms.
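As a rough illustration of the reduction idea described in the abstract (a minimal sketch, not the paper's exact formulation), the one-sided loss penalizes each per-class cost estimate on only one side: the minimum-cost class may be underestimated but not overestimated, and every other class may be overestimated but not underestimated, so that the argmin prediction recovers the minimum-cost class. The function names and the direction vector `z` below are illustrative assumptions.

```python
import numpy as np

def one_sided_loss(pred_costs, true_costs):
    """Hedged sketch of a one-sided regression loss for cost-sensitive
    classification: hinge-penalize over-estimation of the minimum-cost
    class and under-estimation of every other class."""
    pred_costs = np.asarray(pred_costs, dtype=float)
    true_costs = np.asarray(true_costs, dtype=float)
    k_min = np.argmin(true_costs)
    # z_k = +1 for the min-cost class (penalize pred > cost),
    # z_k = -1 otherwise (penalize pred < cost)
    z = -np.ones_like(true_costs)
    z[k_min] = 1.0
    return float(np.maximum(z * (pred_costs - true_costs), 0.0).sum())

def predict(pred_costs):
    """Minimum-cost prediction: the class with the smallest estimated cost."""
    return int(np.argmin(pred_costs))
```

With perfect cost estimates the loss is zero, and the prediction rule simply takes the argmin of the estimated costs, mirroring how one-versus-all SVM takes the argmax of the decision values.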
Cite
Text
Tu and Lin. "One-Sided Support Vector Regression for Multiclass Cost-Sensitive Classification." International Conference on Machine Learning, 2010.
Markdown
[Tu and Lin. "One-Sided Support Vector Regression for Multiclass Cost-Sensitive Classification." International Conference on Machine Learning, 2010.](https://mlanthology.org/icml/2010/tu2010icml-one/)
BibTeX
@inproceedings{tu2010icml-one,
title = {{One-Sided Support Vector Regression for Multiclass Cost-Sensitive Classification}},
author = {Tu, Han-Hsing and Lin, Hsuan-Tien},
booktitle = {International Conference on Machine Learning},
year = {2010},
pages = {1095-1102},
url = {https://mlanthology.org/icml/2010/tu2010icml-one/}
}