A Unified View on Multi-Class Support Vector Classification

Abstract

A unified view on multi-class support vector machines (SVMs) is presented, covering the most prominent variants, including the one-vs-all approach and the algorithms proposed by Weston & Watkins, Crammer & Singer, Lee, Lin, & Wahba, and Liu & Yuan. The unification leads to a template for the quadratic training problems and to new multi-class SVM formulations. Within our framework, we provide a comparative analysis of the various notions of multi-class margin and margin-based loss. In particular, we demonstrate limitations of the loss function considered, for instance, in the Crammer & Singer machine.
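As an illustration of the simplest variant discussed in the abstract, the one-vs-all approach trains one binary classifier per class and predicts via the arg-max of the resulting decision values. The sketch below is not the paper's unified formulation; it is a minimal NumPy implementation of one-vs-all linear SVMs trained by subgradient descent on the regularized binary hinge loss, with all hyperparameters (learning rate, regularization, epochs) chosen arbitrarily for the toy example.

```python
import numpy as np


def train_ova_svm(X, y, n_classes, lr=0.1, reg=0.01, epochs=500):
    """One-vs-all linear SVMs: one binary hinge-loss problem per class.

    For each class k, solve min_w  (reg/2)*||w||^2 + mean_i max(0, 1 - t_i (w.x_i + b))
    with t_i = +1 if y_i == k else -1, by plain subgradient descent.
    """
    n, d = X.shape
    W = np.zeros((n_classes, d))
    b = np.zeros(n_classes)
    for k in range(n_classes):
        t = np.where(y == k, 1.0, -1.0)  # class k vs. rest
        for _ in range(epochs):
            margins = t * (X @ W[k] + b[k])
            viol = margins < 1.0  # points violating the margin
            # Subgradient of the regularized average hinge loss.
            grad_w = reg * W[k] - (t[viol, None] * X[viol]).sum(axis=0) / n
            grad_b = -t[viol].sum() / n
            W[k] -= lr * grad_w
            b[k] -= lr * grad_b
    return W, b


def predict_ova(W, b, X):
    """Assign each point to the class with the largest decision value."""
    return np.argmax(X @ W.T + b, axis=1)


if __name__ == "__main__":
    # Toy data: three well-separated Gaussian clusters.
    rng = np.random.default_rng(0)
    centers = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
    X = np.vstack([c + 0.3 * rng.standard_normal((50, 2)) for c in centers])
    y = np.repeat(np.arange(3), 50)

    W, b = train_ova_svm(X, y, n_classes=3)
    acc = (predict_ova(W, b, X) == y).mean()
    print(f"training accuracy: {acc:.2f}")
```

Note that, as the paper's comparative analysis makes clear, this decomposition into independent binary problems is only one of several margin notions; the machines of Weston & Watkins and Crammer & Singer instead couple all classes in a single optimization problem.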

Cite

Text

Doğan et al. "A Unified View on Multi-Class Support Vector Classification." Journal of Machine Learning Research, 17:1-32, 2016.

Markdown

[Doğan et al. "A Unified View on Multi-Class Support Vector Classification." Journal of Machine Learning Research, 17:1-32, 2016.](https://mlanthology.org/jmlr/2016/dogan2016jmlr-unified/)

BibTeX

@article{dogan2016jmlr-unified,
  title     = {{A Unified View on Multi-Class Support Vector Classification}},
  author    = {Doğan, Ürün and Glasmachers, Tobias and Igel, Christian},
  journal   = {Journal of Machine Learning Research},
  year      = {2016},
  pages     = {1--32},
  volume    = {17},
  url       = {https://mlanthology.org/jmlr/2016/dogan2016jmlr-unified/}
}