A Short Note About the Application of Polynomial Kernels with Fractional Degree in Support Vector Learning

Abstract

In the mid-1990s a fundamentally new machine learning approach was developed by V. N. Vapnik: the Support Vector Machine (SVM). This method is very promising and is attracting increasing attention in fields where neural networks and decision tree methods are applied. While neural networks may be considered (correctly or not) well understood and are in wide use, Support Vector Learning still has rough edges in its theoretical details, and its inherent numerical tasks prevent it from being applied easily in practice. This paper takes up a new aspect, the use of fractional degrees on polynomial kernels in the SVM, discovered in the course of an implementation of the algorithm. Fractional degrees on polynomial kernels broaden the capabilities of the SVM and offer the possibility of dealing with feature spaces of infinite dimension. We also introduce a method to simplify the quadratic programming problem at the core of the SVM.
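The abstract's central idea, evaluating a polynomial kernel at a non-integer degree, can be sketched as a plain Gram-matrix computation. This is a minimal illustration under stated assumptions, not the paper's implementation: the function name, the `coef0` offset, and the clamping of the base to zero (to keep a fractional power real-valued) are all choices made here for the sketch.

```python
import numpy as np

def poly_kernel(X, Y, degree=1.5, coef0=1.0):
    """Polynomial kernel K(x, y) = (<x, y> + coef0)^degree.

    With a fractional degree the base must stay non-negative for the
    power to be real, so the base is clamped at zero here; in practice
    the inputs would be scaled so that <x, y> + coef0 >= 0 holds.
    """
    # Pairwise inner products plus the constant offset.
    G = X @ Y.T + coef0
    # Fractional power of the (clamped) base.
    return np.power(np.maximum(G, 0.0), degree)
```

For example, on two orthonormal points the diagonal entries become `(1 + 1)^1.5` and the off-diagonal entries `(0 + 1)^1.5 = 1`, and the resulting Gram matrix is symmetric, as a kernel matrix must be.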

Cite

Text

Rossius et al. "A Short Note About the Application of Polynomial Kernels with Fractional Degree in Support Vector Learning." European Conference on Machine Learning, 1998. doi:10.1007/BFB0026684

Markdown

[Rossius et al. "A Short Note About the Application of Polynomial Kernels with Fractional Degree in Support Vector Learning." European Conference on Machine Learning, 1998.](https://mlanthology.org/ecmlpkdd/1998/rossius1998ecml-short/) doi:10.1007/BFB0026684

BibTeX

@inproceedings{rossius1998ecml-short,
  title     = {{A Short Note About the Application of Polynomial Kernels with Fractional Degree in Support Vector Learning}},
  author    = {Rossius, Rolf and Zenker, Gérard and Ittner, Andreas and Dilger, Werner},
  booktitle = {European Conference on Machine Learning},
  year      = {1998},
  pages     = {143--148},
  doi       = {10.1007/BFB0026684},
  url       = {https://mlanthology.org/ecmlpkdd/1998/rossius1998ecml-short/}
}