Margin Maximizing Loss Functions
Abstract
Margin maximizing properties play an important role in the analysis of classification models, such as boosting and support vector machines. Margin maximization is theoretically interesting because it facilitates generalization error analysis, and practically interesting because it presents a clear geometric interpretation of the models being built. We formulate and prove a sufficient condition for the solutions of regularized loss functions to converge to margin maximizing separators, as the regularization vanishes. This condition covers the hinge loss of SVMs, the exponential loss of AdaBoost, and the logistic regression loss. We also generalize it to multi-class classification problems, and present margin maximizing multi-class versions of logistic regression and support vector machines.
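To make the convergence claim concrete, the following is a minimal sketch of the setup in standard notation; the symbols ($L$, $\beta$, $\lambda$, $p$, $x_i$, $y_i$) are my own choices and are not taken from this page. For training pairs $(x_i, y_i)$ with $y_i \in \{-1, +1\}$ and a non-increasing loss $L$, consider the $\ell_p$-regularized minimizer and its limiting direction as the regularization weight $\lambda$ vanishes:

```latex
% Regularized solution for loss L with an l_p penalty
% (notation assumed for illustration, not quoted from the paper):
\[
\hat{\beta}(\lambda) \;=\; \arg\min_{\beta}\;
\sum_{i=1}^{n} L\!\bigl(y_i\,\beta^{\top} x_i\bigr)
\;+\; \lambda\,\lVert \beta \rVert_p^p .
\]
% Margin maximization: for separable data and a qualifying loss
% (e.g. hinge, exponential, or logistic, per the abstract), the
% normalized solutions converge, as lambda -> 0, to the
% l_p-margin-maximizing separating direction:
\[
\lim_{\lambda \to 0}\;
\frac{\hat{\beta}(\lambda)}{\lVert \hat{\beta}(\lambda) \rVert_p}
\;=\;
\arg\max_{\lVert \beta \rVert_p = 1}\;
\min_{i}\; y_i\,\beta^{\top} x_i .
\]
```

Here $\min_i y_i\,\beta^{\top} x_i$ is the smallest signed margin of the separator $\beta$ over the sample, so the right-hand side is the $\ell_p$-margin-maximizing direction that the abstract refers to.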
Cite

Rosset et al. "Margin Maximizing Loss Functions." Neural Information Processing Systems, 2003.

BibTeX:
@inproceedings{rosset2003neurips-margin,
  title     = {{Margin Maximizing Loss Functions}},
  author    = {Rosset, Saharon and Zhu, Ji and Hastie, Trevor J.},
  booktitle = {Neural Information Processing Systems},
  year      = {2003},
  pages     = {1237--1244},
  url       = {https://mlanthology.org/neurips/2003/rosset2003neurips-margin/}
}