An Improved 1-Norm SVM for Simultaneous Classification and Variable Selection
Abstract
We propose a novel extension of the 1-norm support vector machine (SVM) for simultaneous feature selection and classification. The new algorithm penalizes the empirical hinge loss with an adaptively weighted 1-norm penalty whose weights are computed from a 2-norm SVM fit; hence the new algorithm is called the hybrid SVM. Simulation and real data examples show that the hybrid SVM often improves upon the 1-norm SVM in classification accuracy while also enjoying better feature selection performance.
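To make the two-stage recipe in the abstract concrete, below is a minimal Python sketch using scikit-learn (not part of the paper). It assumes the common adaptive-weight form w_j = 1 / |beta_hat_j|^gamma, with beta_hat taken from the 2-norm SVM; the exact weight form, the choice of gamma, the tuning constants C1 and C2, and the squared hinge loss used in the 1-norm stage are assumptions of the sketch, not the paper's specification.

import numpy as np
from sklearn.svm import LinearSVC

def hybrid_svm(X, y, C1=1.0, C2=1.0, gamma=1.0, eps=1e-8):
    # Stage 1: standard 2-norm (hinge-loss) SVM gives pilot coefficients.
    svm2 = LinearSVC(penalty="l2", loss="hinge", dual=True, C=C2, max_iter=10000)
    beta_hat = svm2.fit(X, y).coef_.ravel()

    # Stage 2: adaptive weights w_j = 1 / |beta_hat_j|^gamma
    # (gamma = 1 is an assumed default; eps guards against division by zero).
    w = 1.0 / (np.abs(beta_hat) ** gamma + eps)

    # Weighted 1-norm fit via feature rescaling: minimizing
    # loss + sum_j w_j * |beta_j| over beta is equivalent to a plain
    # L1-penalized fit on X[:, j] / w_j, mapping back beta_j = beta'_j / w_j.
    # (scikit-learn's L1 LinearSVC uses the squared hinge loss, which
    # here stands in for the hinge loss of the paper.)
    svm1 = LinearSVC(penalty="l1", loss="squared_hinge", dual=False, C=C1,
                     max_iter=10000)
    svm1.fit(X / w, y)
    beta = svm1.coef_.ravel() / w
    return beta, svm1.intercept_[0]

# Hypothetical usage on synthetic data:
if __name__ == "__main__":
    from sklearn.datasets import make_classification
    X, y = make_classification(n_samples=200, n_features=10,
                               n_informative=3, random_state=0)
    beta, b0 = hybrid_svm(X, y)
    print("nonzero coefficients:", np.flatnonzero(beta))

The intuition behind the weighting is the adaptive-lasso one: features that the 2-norm SVM already shrinks toward zero receive large weights and are penalized heavily in the second stage, while strong features are penalized lightly, which is what drives the improved variable selection reported in the abstract.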
Cite
Zou, Hui. "An Improved 1-Norm SVM for Simultaneous Classification and Variable Selection." Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, 2007.
BibTeX
@inproceedings{zou2007aistats-improved,
title = {{An Improved 1-Norm SVM for Simultaneous Classification and Variable Selection}},
author = {Zou, Hui},
booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
year = {2007},
pages = {675--681},
volume = {2},
url = {https://mlanthology.org/aistats/2007/zou2007aistats-improved/}
}