Classification with Support Hyperplanes
Abstract
A new classification method is proposed, called Support Hyperplanes (SHs). To solve a binary classification task, SHs consider the set of all hyperplanes that make no classification mistakes on the training data, referred to as semi-consistent hyperplanes. A test object is classified using the semi-consistent hyperplane that is farthest away from it. In this way, a good balance between goodness-of-fit and model complexity is achieved, where model complexity is proxied by the distance between a test object and a semi-consistent hyperplane. This notion of complexity resembles the one embodied in the width of the so-called margin between two classes, which arises in Support Vector Machine learning. Class overlap can be handled by introducing kernels and/or slack variables. On several widely used empirical data sets, SHs perform promisingly against standard classifiers.
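The decision rule described in the abstract can be prototyped as one small convex program per test point: a hyperplane (w, b) is semi-consistent when y_i (w · x_i + b) ≥ 0 for every training point, and pinning the test point at w · x_test + b = ±1 makes its distance to the hyperplane 1/‖w‖, so maximising the distance amounts to minimising ‖w‖². The following is a minimal sketch of the linear, hard-constraint case only (no kernels, no slack variables), using a hypothetical `sh_classify` helper built on `scipy.optimize`; it is an illustration of the idea, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import minimize


def sh_classify(X, y, x_test):
    """Label x_test (+1 or -1) by the farthest semi-consistent hyperplane.

    For each candidate side, solve: minimise ||w||^2 subject to
    y_i * (w . x_i + b) >= 0 for all training points (semi-consistency)
    and w . x_test + b = side (normalisation). The distance from x_test
    to the resulting hyperplane is 1 / ||w||; the side reachable at the
    larger distance is the predicted label.
    """
    d = X.shape[1]
    best_label, best_dist = 0, -np.inf
    for side in (+1, -1):
        cons = [
            # semi-consistency: no training point is misclassified
            {"type": "ineq", "fun": lambda wb: y * (X @ wb[:d] + wb[d])},
            # normalisation: pin x_test at signed output `side`
            {"type": "eq",
             "fun": lambda wb, s=side: (x_test @ wb[:d] + wb[d]) - s},
        ]
        res = minimize(lambda wb: wb[:d] @ wb[:d], np.zeros(d + 1),
                       constraints=cons, method="SLSQP")
        w, b = res.x[:d], res.x[d]
        # accept only genuinely feasible solutions: one of the two
        # sides can be infeasible when x_test lies deep in one class
        feasible = (np.all(y * (X @ w + b) >= -1e-6)
                    and abs((x_test @ w + b) - side) < 1e-6)
        if res.success and feasible:
            dist = 1.0 / np.linalg.norm(w)
            if dist > best_dist:
                best_dist, best_label = dist, side
    return best_label
```

For example, with negative points near x₁ = 0 and positive points near x₁ = 2, a test point at x₁ = −1 is assigned the negative class, since every semi-consistent hyperplane that could place it on the positive side is ruled out by the training constraints. Handling class overlap, as the abstract notes, would additionally require slack variables in the inequality constraints or a kernelised formulation.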
Cite
Text
Nalbantov et al. "Classification with Support Hyperplanes." European Conference on Machine Learning, 2006. doi:10.1007/11871842_70
Markdown
[Nalbantov et al. "Classification with Support Hyperplanes." European Conference on Machine Learning, 2006.](https://mlanthology.org/ecmlpkdd/2006/nalbantov2006ecml-classification/) doi:10.1007/11871842_70
BibTeX
@inproceedings{nalbantov2006ecml-classification,
title = {{Classification with Support Hyperplanes}},
author = {Nalbantov, Georgi I. and Bioch, Jan C. and Groenen, Patrick J. F.},
booktitle = {European Conference on Machine Learning},
year = {2006},
pages = {703-710},
doi = {10.1007/11871842_70},
url = {https://mlanthology.org/ecmlpkdd/2006/nalbantov2006ecml-classification/}
}