Learning Large Margin Classifiers Locally and Globally

Abstract

A new large margin classifier, named the Maxi-Min Margin Machine ($\mathrm{M^4}$), is proposed in this paper. This new classifier is constructed based on both a "local" and a "global" view of the data, whereas the most popular large margin classifier, the Support Vector Machine (SVM), and the recently proposed Minimax Probability Machine (MPM) consider data only either locally or globally. The new model is theoretically important in the sense that both SVM and MPM can be considered as its special cases. Furthermore, the optimization of $\mathrm{M^4}$ can be cast as a sequential conic programming problem, which can be solved efficiently. We describe the $\mathrm{M^4}$ model definition, provide a clear geometrical interpretation, present theoretical justifications, propose efficient solving methods, and perform a series of evaluations on both synthetic data sets and real-world benchmark data sets. A comparison with SVM and MPM also demonstrates the advantages of the new model.
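The "local and global" idea in the abstract can be made concrete: $\mathrm{M^4}$ normalizes each point's margin by the covariance of its own class, so the hyperplane that maximizes the worst-case normalized margin accounts for class spread (global) as well as individual points (local). Below is a minimal Python sketch, assuming the commonly cited worst-case-margin form $\max_{w,b}\min_i y_i(w^\top x_i + b)/\sqrt{w^\top \Sigma_{y_i} w}$; the toy data and helper names are illustrative, and the code only *evaluates* this criterion for a fixed hyperplane rather than solving the paper's conic program.

```python
# Sketch (not the paper's solver): evaluate the M^4-style worst-case
# normalized margin  min_i  y_i (w^T x_i + b) / sqrt(w^T Sigma_{y_i} w)
# for a fixed hyperplane (w, b) on hypothetical 2-D toy data.
import math

def mean(points):
    n, d = len(points), len(points[0])
    return [sum(p[k] for p in points) / n for k in range(d)]

def covariance(points):
    # Sample covariance matrix of a list of d-dimensional points.
    m = mean(points)
    n, d = len(points), len(m)
    cov = [[0.0] * d for _ in range(d)]
    for p in points:
        for i in range(d):
            for j in range(d):
                cov[i][j] += (p[i] - m[i]) * (p[j] - m[j])
    return [[cov[i][j] / (n - 1) for j in range(d)] for i in range(d)]

def mahalanobis_scale(w, cov):
    # sqrt(w^T Sigma w): the spread of a class along direction w.
    d = len(w)
    return math.sqrt(sum(w[i] * cov[i][j] * w[j]
                         for i in range(d) for j in range(d)))

def worst_case_margin(w, b, pos, neg):
    # M^4 maximizes the smallest of these normalized margins over (w, b);
    # here we only evaluate it for the given hyperplane.
    s_pos = mahalanobis_scale(w, covariance(pos))
    s_neg = mahalanobis_scale(w, covariance(neg))
    margins = [(sum(wi * xi for wi, xi in zip(w, x)) + b) / s_pos for x in pos]
    margins += [-(sum(wi * xi for wi, xi in zip(w, x)) + b) / s_neg for x in neg]
    return min(margins)

pos = [[2.0, 2.0], [3.0, 1.5], [2.5, 3.0]]        # toy class +1
neg = [[-2.0, -1.0], [-1.5, -2.5], [-3.0, -2.0]]  # toy class -1
rho = worst_case_margin([1.0, 1.0], 0.0, pos, neg)
```

Because the two classes may have different covariances, the same geometric margin counts for less against a class with large spread along $w$, which is exactly how the model mixes local (per-point) and global (per-class) information.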

Cite

Text

Huang et al. "Learning Large Margin Classifiers Locally and Globally." International Conference on Machine Learning, 2004. doi:10.1145/1015330.1015365

Markdown

[Huang et al. "Learning Large Margin Classifiers Locally and Globally." International Conference on Machine Learning, 2004.](https://mlanthology.org/icml/2004/huang2004icml-learning/) doi:10.1145/1015330.1015365

BibTeX

@inproceedings{huang2004icml-learning,
  title     = {{Learning Large Margin Classifiers Locally and Globally}},
  author    = {Huang, Kaizhu and Yang, Haiqin and King, Irwin and Lyu, Michael R.},
  booktitle = {International Conference on Machine Learning},
  year      = {2004},
  doi       = {10.1145/1015330.1015365},
  url       = {https://mlanthology.org/icml/2004/huang2004icml-learning/}
}