Risk Minimization, Probability Elicitation, and Cost-Sensitive SVMs

Abstract

A new procedure for learning cost-sensitive SVM classifiers is proposed. The SVM hinge loss is extended to the cost-sensitive setting, and the cost-sensitive SVM is derived as the minimizer of the associated risk. The extension of the hinge loss draws on recent connections between risk minimization and probability elicitation. These connections are generalized to cost-sensitive classification, in a manner that guarantees consistency with the cost-sensitive Bayes risk and the associated Bayes decision rule. This ensures that optimal decision rules, under the new hinge loss, implement the Bayes-optimal cost-sensitive classification boundary. Minimization of the new hinge loss is shown to be a generalization of the classic SVM optimization problem, and can be solved by identical procedures. The resulting algorithm avoids the shortcomings of previous approaches to cost-sensitive SVM design, and has superior experimental performance.
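To illustrate the general idea of minimizing a cost-sensitive hinge risk, the sketch below trains a linear classifier by subgradient descent on a class-weighted hinge loss. This is a simplified illustration, not the specific cost-sensitive hinge loss derived in the paper; the cost parameters `c_pos` and `c_neg` (penalties for missing a positive or a negative example) and all function names are assumptions introduced here for the example.

```python
import numpy as np

def cost_sensitive_hinge(w, X, y, c_pos, c_neg, reg=0.1):
    """Class-weighted hinge risk (illustrative, not the paper's exact loss).

    Errors on positives (y = +1) are scaled by c_pos, errors on
    negatives (y = -1) by c_neg, plus an L2 regularizer on w.
    """
    margins = y * (X @ w)
    weights = np.where(y == 1, c_pos, c_neg)
    return np.mean(weights * np.maximum(0.0, 1.0 - margins)) + 0.5 * reg * (w @ w)

def fit(X, y, c_pos, c_neg, lr=0.1, steps=500, reg=0.1):
    """Subgradient descent on the weighted hinge risk."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        margins = y * (X @ w)
        weights = np.where(y == 1, c_pos, c_neg)
        active = margins < 1.0  # examples inside the margin contribute a subgradient
        grad = -(weights[active] * y[active]) @ X[active] / n + reg * w
        w -= lr * grad
    return w
```

Raising `c_pos` relative to `c_neg` shifts the learned boundary toward the negative class, trading false positives for fewer false negatives, which is the qualitative behavior any cost-sensitive SVM formulation aims to control in a principled way.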

Cite

Text

Masnadi-Shirazi and Vasconcelos. "Risk Minimization, Probability Elicitation, and Cost-Sensitive SVMs." International Conference on Machine Learning, 2010.

Markdown

[Masnadi-Shirazi and Vasconcelos. "Risk Minimization, Probability Elicitation, and Cost-Sensitive SVMs." International Conference on Machine Learning, 2010.](https://mlanthology.org/icml/2010/masnadishirazi2010icml-risk/)

BibTeX

@inproceedings{masnadishirazi2010icml-risk,
  title     = {{Risk Minimization, Probability Elicitation, and Cost-Sensitive SVMs}},
  author    = {Masnadi-Shirazi, Hamed and Vasconcelos, Nuno},
  booktitle = {International Conference on Machine Learning},
  year      = {2010},
  pages     = {759--766},
  url       = {https://mlanthology.org/icml/2010/masnadishirazi2010icml-risk/}
}