The Perceptron with Dynamic Margin

Abstract

The classical perceptron rule provides a varying upper bound on the maximum margin, namely the length of the current weight vector divided by the total number of updates up to that time. By requiring that the perceptron update its internal state whenever the normalized margin of a pattern is found not to exceed a certain fraction of this dynamic upper bound, we construct a new approximate maximum margin classifier called the perceptron with dynamic margin (PDM). We demonstrate that PDM converges in a finite number of steps and derive an upper bound on their number. We also experimentally compare PDM with other perceptron-like algorithms and support vector machines on hard margin tasks involving linear kernels, which are equivalent to 2-norm soft margin tasks.
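The update rule sketched in the abstract can be illustrated in a few lines of NumPy. This is a hedged reading, not the paper's exact algorithm: it assumes unit-norm patterns, uses `eps` for the target fraction of the dynamic bound, and the names `pdm_train`, `max_epochs` are illustrative. Since the bound on the maximum margin is `||w||/t` after `t` updates, the normalized-margin condition `y (w.x)/||w|| <= eps * ||w||/t` is applied in the equivalent unnormalized form `y (w.x) <= eps * ||w||^2 / t`.

```python
import numpy as np

def pdm_train(X, y, eps=0.5, max_epochs=100):
    """Sketch of a perceptron with dynamic margin (illustrative, not the
    paper's exact pseudocode). Assumes rows of X are unit-norm patterns
    and y contains +1/-1 labels; eps in (0, 1) is the fraction of the
    dynamic margin bound ||w||/t that triggers an update."""
    n, d = X.shape
    w = np.zeros(d)
    t = 0  # total number of updates so far
    for _ in range(max_epochs):
        updated = False
        for i in range(n):
            margin = y[i] * (w @ X[i])  # unnormalized margin of pattern i
            # dynamic upper bound on the maximum margin is ||w||/t;
            # compare y (w.x) against eps * ||w||^2 / t (t = 0 forces
            # a plain perceptron update on any non-positive margin)
            threshold = eps * (w @ w) / t if t > 0 else 0.0
            if margin <= threshold:
                w = w + y[i] * X[i]  # standard perceptron update
                t += 1
                updated = True
        if not updated:  # every pattern clears the dynamic margin
            break
    return w
```

On a toy separable set, the loop stops once every pattern's margin exceeds the shrinking threshold, yielding an approximate maximum-margin separator.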

Cite

Text

Panagiotakopoulos and Tsampouka. "The Perceptron with Dynamic Margin." International Conference on Algorithmic Learning Theory, 2011. doi:10.1007/978-3-642-24412-4_18

Markdown

[Panagiotakopoulos and Tsampouka. "The Perceptron with Dynamic Margin." International Conference on Algorithmic Learning Theory, 2011.](https://mlanthology.org/alt/2011/panagiotakopoulos2011alt-perceptron/) doi:10.1007/978-3-642-24412-4_18

BibTeX

@inproceedings{panagiotakopoulos2011alt-perceptron,
  title     = {{The Perceptron with Dynamic Margin}},
  author    = {Panagiotakopoulos, Constantinos and Tsampouka, Petroula},
  booktitle = {International Conference on Algorithmic Learning Theory},
  year      = {2011},
  pages     = {204--218},
  doi       = {10.1007/978-3-642-24412-4_18},
  url       = {https://mlanthology.org/alt/2011/panagiotakopoulos2011alt-perceptron/}
}