An Efficient Classifier Based on Hierarchical Mixing Linear Support Vector Machines

Abstract

Support vector machines (SVMs) play a dominant role in data classification due to their good generalization performance. However, they suffer from high computational complexity in the classification phase when there is a considerable number of support vectors (SVs). It is therefore desirable to design algorithms that classify efficiently, so that SVMs can be applied in real-time pattern recognition systems. To this end, we propose a novel classifier called HMLSVMs (Hierarchical Mixing Linear Support Vector Machines) in this paper, which has a hierarchical structure with a mixing linear SVM classifier at each node and predicts the label of a sample using only a few hyperplanes. We also give a generalization error bound for the class of locally linear SVMs (LLSVMs) based on Rademacher theory, which ensures that overfitting can be effectively avoided. Experimental evaluations show that, while maintaining classification performance comparable to kernel SVMs (KSVMs), the proposed classifier achieves high efficiency in the classification stage.
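The abstract's central idea, evaluating only a few linear hyperplanes per prediction by routing a sample down a hierarchy, can be illustrated with a toy sketch. This is not the authors' HMLSVMs implementation: the tree structure, node class, and all weights below are hand-picked, illustrative assumptions; a real system would learn the hyperplanes from data.

```python
# Hedged sketch: a tiny hierarchy where each internal node routes a
# sample with one linear hyperplane and each leaf makes a final
# linear SVM-style decision. Per prediction, only O(depth) dot
# products are computed, which is the source of the speedup the
# abstract describes. All parameters here are illustrative.

def dot(w, x):
    # Inner product of a weight vector and a sample.
    return sum(wi * xi for wi, xi in zip(w, x))

class Node:
    def __init__(self, w, b, left=None, right=None,
                 label_pos=None, label_neg=None):
        self.w, self.b = w, b                  # hyperplane parameters
        self.left, self.right = left, right    # children (None at a leaf)
        self.label_pos = label_pos             # leaf label if w.x + b >= 0
        self.label_neg = label_neg             # leaf label otherwise

    def predict(self, x):
        s = dot(self.w, x) + self.b
        if self.left is None:                  # leaf: final linear decision
            return self.label_pos if s >= 0 else self.label_neg
        return (self.right if s >= 0 else self.left).predict(x)

# Hand-built example tree: the root splits on the first coordinate,
# and each leaf classifies on the second coordinate.
leaf_l = Node((0, 1), 0, label_pos="A", label_neg="B")
leaf_r = Node((0, 1), 0, label_pos="C", label_neg="D")
root = Node((1, 0), 0, left=leaf_l, right=leaf_r)

print(root.predict((-1, 2)))   # routed left, then labelled "A"
print(root.predict((3, -1)))   # routed right, then labelled "D"
```

Each call touches exactly two hyperplanes here, regardless of how many training samples produced them; a kernel SVM would instead evaluate a kernel against every support vector.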

Cite

Text

Wang et al. "An Efficient Classifier Based on Hierarchical Mixing Linear Support Vector Machines." International Joint Conference on Artificial Intelligence, 2015.

Markdown

[Wang et al. "An Efficient Classifier Based on Hierarchical Mixing Linear Support Vector Machines." International Joint Conference on Artificial Intelligence, 2015.](https://mlanthology.org/ijcai/2015/wang2015ijcai-efficient/)

BibTeX

@inproceedings{wang2015ijcai-efficient,
  title     = {{An Efficient Classifier Based on Hierarchical Mixing Linear Support Vector Machines}},
  author    = {Wang, Di and Zhang, Xiaoqin and Fan, Mingyu and Ye, Xiuzi},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2015},
  pages     = {3897--3903},
  url       = {https://mlanthology.org/ijcai/2015/wang2015ijcai-efficient/}
}