Analysis of Generic Perceptron-like Large Margin Classifiers

Abstract

We analyse perceptron-like algorithms with margin, considering both the standard classification condition and a modified one that demands a specific value of the margin in the augmented space. The new algorithms are shown to converge in a finite number of steps and are used to approximately locate the optimal weight vector in the augmented space. As the data are embedded in the augmented space at a larger distance from the origin, the maximum margin in that space approaches the maximum geometric margin in the original space. Thus, our procedures exploiting the new algorithms can be regarded as approximate maximal margin classifiers.
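The family of algorithms the abstract refers to can be illustrated with a minimal perceptron-with-margin sketch: the weight vector is updated whenever a point's (unnormalised) margin fails to exceed a threshold, and the data are embedded in the augmented space by appending a constant feature so the bias is folded into the weight vector. This is a generic illustration under those assumptions, not the paper's exact procedure; the function name and parameters are hypothetical.

```python
import numpy as np

def margin_perceptron(X, y, margin=0.1, eta=1.0, max_epochs=100):
    """Generic perceptron-with-margin sketch (illustrative, not the
    paper's specific algorithm): update w whenever the unnormalised
    margin y * <w, x> does not exceed `margin`."""
    # Embed the data in the augmented space: append a constant
    # feature so the bias term lives inside the weight vector.
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xa.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(Xa, y):
            if yi * np.dot(w, xi) <= margin:  # margin violation
                w += eta * yi * xi            # perceptron update
                mistakes += 1
        if mistakes == 0:  # every point clears the margin: converged
            break
    return w
```

On linearly separable data the loop terminates once every point clears the margin, mirroring the finite-step convergence the abstract establishes for its (more refined) update conditions.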

Cite

Text

Tsampouka and Shawe-Taylor. "Analysis of Generic Perceptron-like Large Margin Classifiers." European Conference on Machine Learning, 2005. doi:10.1007/11564096_77

Markdown

[Tsampouka and Shawe-Taylor. "Analysis of Generic Perceptron-like Large Margin Classifiers." European Conference on Machine Learning, 2005.](https://mlanthology.org/ecmlpkdd/2005/tsampouka2005ecml-analysis/) doi:10.1007/11564096_77

BibTeX

@inproceedings{tsampouka2005ecml-analysis,
  title     = {{Analysis of Generic Perceptron-like Large Margin Classifiers}},
  author    = {Tsampouka, Petroula and Shawe-Taylor, John},
  booktitle = {European Conference on Machine Learning},
  year      = {2005},
  pages     = {750--758},
  doi       = {10.1007/11564096_77},
  url       = {https://mlanthology.org/ecmlpkdd/2005/tsampouka2005ecml-analysis/}
}