Multi-Class SVMs: From Tighter Data-Dependent Generalization Bounds to Novel Algorithms
Abstract
This paper studies the generalization performance of multi-class classification algorithms, for which we obtain, for the first time, a data-dependent generalization error bound with a logarithmic dependence on the class size, substantially improving the state-of-the-art linear dependence in the existing data-dependent generalization analysis. The theoretical analysis motivates us to introduce a new multi-class classification machine based on ℓp-norm regularization, where the parameter p controls the complexity of the corresponding bounds. We derive an efficient optimization algorithm based on Fenchel duality theory. Benchmarks on several real-world datasets show that the proposed algorithm can achieve significant accuracy gains over the state of the art.
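To make the abstract's ℓp-norm regularization concrete, a minimal sketch of a Crammer–Singer-style primal that such a machine could take is given below. The notation (per-class weight vectors w_j for c classes, n training pairs (x_i, y_i), slack variables ξ_i, trade-off parameter C) is assumed for illustration rather than taken from the paper, whose exact objective and constraints may differ.

\[
\min_{w_1,\dots,w_c,\ \xi}\ \ \frac{1}{2}\Big(\sum_{j=1}^{c}\|w_j\|_2^{p}\Big)^{2/p} \;+\; C\sum_{i=1}^{n}\xi_i
\quad\text{s.t.}\quad
w_{y_i}^{\top}x_i - w_{y}^{\top}x_i \ \ge\ 1-\xi_i\ \ (\forall\, y\neq y_i),\qquad \xi_i \ge 0.
\]

For p = 2 the block regularizer reduces to the usual squared ℓ2 penalty, recovering the standard Crammer–Singer multi-class SVM; varying p ≥ 1 changes the regularizer's conjugate in the Fenchel dual, which is presumably what the duality-based optimization mentioned in the abstract exploits.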
Cite
Text
Lei et al. "Multi-Class SVMs: From Tighter Data-Dependent Generalization Bounds to Novel Algorithms." Neural Information Processing Systems, 2015.

Markdown
[Lei et al. "Multi-Class SVMs: From Tighter Data-Dependent Generalization Bounds to Novel Algorithms." Neural Information Processing Systems, 2015.](https://mlanthology.org/neurips/2015/lei2015neurips-multiclass/)

BibTeX
@inproceedings{lei2015neurips-multiclass,
title = {{Multi-Class SVMs: From Tighter Data-Dependent Generalization Bounds to Novel Algorithms}},
author = {Lei, Yunwen and Dogan, Urun and Binder, Alexander and Kloft, Marius},
booktitle = {Neural Information Processing Systems},
year = {2015},
pages = {2035--2043},
url = {https://mlanthology.org/neurips/2015/lei2015neurips-multiclass/}
}