Margins, Kernels and Non-Linear Smoothed Perceptrons

Abstract

We focus on the problem of finding a non-linear classification function that lies in a Reproducing Kernel Hilbert Space (RKHS), both from the primal point of view (finding a perfect separator when one exists) and the dual point of view (giving a certificate of non-existence), with special focus on generalizations of two classical schemes: the Perceptron (primal) and Von Neumann (dual) algorithms. We cast our problem as one of maximizing the regularized normalized hard margin ($\rho$) in an RKHS and use the Representer Theorem to rephrase it in terms of a Mahalanobis dot-product/semi-norm associated with the kernel's (normalized and signed) Gram matrix. We derive an accelerated smoothed algorithm with a convergence rate of $\tfrac{\sqrt{\log n}}{\rho}$ given $n$ separable points, which is strikingly similar to the classical kernelized Perceptron algorithm whose rate is $\tfrac{1}{\rho^2}$. When no such classifier exists, we prove a version of Gordan's separation theorem for RKHSs and give a reinterpretation of negative margins. This allows us to give guarantees for a primal-dual algorithm that halts in $\min\left\{\tfrac{\sqrt{n}}{|\rho|}, \tfrac{\sqrt{n}}{\varepsilon}\right\}$ iterations, with a perfect separator in the RKHS if the primal is feasible or a dual $\varepsilon$-certificate of near-infeasibility otherwise.
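For concreteness, below is a minimal sketch (not taken from the paper) of the classical kernelized Perceptron that the abstract uses as its baseline. The Gaussian kernel choice and the dataset interface are illustrative assumptions; the mistake-driven dual update is the standard one, and the $\tfrac{1}{\rho^2}$ figure quoted above is the usual Novikoff-style mistake bound under a normalized margin $\rho$.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2).
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def kernel_perceptron(X, y, kernel=rbf_kernel, max_epochs=100):
    """Classical kernelized Perceptron in dual form.

    alpha[j] counts the mistakes made on point j; the implicit classifier is
    f(x) = sum_j alpha[j] * y[j] * k(X[j], x). If the data are separable with
    normalized margin rho in the RKHS, the total number of mistakes (and hence
    updates) is bounded by 1 / rho**2.
    """
    n = len(y)
    K = kernel(X, X)
    alpha = np.zeros(n)
    for _ in range(max_epochs):
        mistakes = 0
        for i in range(n):
            # Current score on point i: sum_j alpha[j] * y[j] * k(X[j], X[i]).
            score = (alpha * y) @ K[:, i]
            if y[i] * score <= 0:   # mistake (or tie): bump the dual weight
                alpha[i] += 1.0
                mistakes += 1
        if mistakes == 0:           # a perfect separator has been found
            break
    return alpha
```

A new point `x` would then be classified by the sign of `(alpha * y) @ kernel(X, x[None, :])[:, 0]`; the paper's smoothed and accelerated variant improves on this scheme's $\tfrac{1}{\rho^2}$ dependence.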

Cite

Text

Ramdas and Peña. "Margins, Kernels and Non-Linear Smoothed Perceptrons." International Conference on Machine Learning, 2014.

Markdown

[Ramdas and Peña. "Margins, Kernels and Non-Linear Smoothed Perceptrons." International Conference on Machine Learning, 2014.](https://mlanthology.org/icml/2014/ramdas2014icml-margins/)

BibTeX

@inproceedings{ramdas2014icml-margins,
  title     = {{Margins, Kernels and Non-Linear Smoothed Perceptrons}},
  author    = {Ramdas, Aaditya and Peña, Javier},
  booktitle = {International Conference on Machine Learning},
  year      = {2014},
  pages     = {244--252},
  volume    = {32},
  url       = {https://mlanthology.org/icml/2014/ramdas2014icml-margins/}
}