VC Theory of Large Margin Multi-Category Classifiers
Abstract
In the context of discriminant analysis, Vapnik's statistical learning theory has mainly been developed in three directions: the computation of dichotomies with binary-valued functions, the computation of dichotomies with real-valued functions, and the computation of polytomies with functions taking their values in finite sets, typically the set of categories itself. The case of classes of vector-valued functions used to compute polytomies has seldom been considered independently, which is unsatisfactory for three main reasons. First, this case encompasses the other ones. Second, it cannot be treated appropriately through a naïve extension of the results devoted to the computation of dichotomies. Third, most of the classification problems encountered in practice involve multiple categories.
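To make the setting concrete, here is a minimal sketch (illustrative only, not code from the paper) of the standard construction the abstract alludes to: a vector-valued function g = (g_1, ..., g_Q) computes a polytomy by assigning an input x to the category whose component score is largest, and the margin at a labeled example (x, y) is the gap between the score of the true category and the best competing score. All function names below are hypothetical.

```python
import numpy as np

def predict(g_x):
    """Category computed by a vector-valued classifier:
    the index k maximizing the component score g_k(x)."""
    return int(np.argmax(np.asarray(g_x, dtype=float)))

def multiclass_margin(g_x, y):
    """Margin of the classifier at (x, y):
    g_y(x) - max_{k != y} g_k(x).
    Positive iff the example is correctly classified."""
    scores = np.asarray(g_x, dtype=float)
    competitors = np.delete(scores, y)  # scores of all categories k != y
    return float(scores[y] - competitors.max())
```

For instance, with component scores (0.2, 1.5, 0.3) and true category 1, the margin is 1.5 - 0.3 = 1.2; a negative margin signals a misclassification. Large margin multi-category classifiers are those trained to make this quantity large on the training sample.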
Cite
Text
Guermeur. "VC Theory of Large Margin Multi-Category Classifiers." Journal of Machine Learning Research, 2007.
Markdown
[Guermeur. "VC Theory of Large Margin Multi-Category Classifiers." Journal of Machine Learning Research, 2007.](https://mlanthology.org/jmlr/2007/guermeur2007jmlr-vc/)
BibTeX
@article{guermeur2007jmlr-vc,
title = {{VC Theory of Large Margin Multi-Category Classifiers}},
author = {Guermeur, Yann},
journal = {Journal of Machine Learning Research},
year = {2007},
pages = {2551--2594},
volume = {8},
url = {https://mlanthology.org/jmlr/2007/guermeur2007jmlr-vc/}
}