Top-K Multiclass SVM
Abstract
Class ambiguity is typical in image classification problems with a large number of classes. When classes are difficult to discriminate, it makes sense to allow k guesses and evaluate classifiers based on the top-k error instead of the standard zero-one loss. We propose top-k multiclass SVM as a direct method to optimize for top-k performance. Our generalization of the well-known multiclass SVM is based on a tight convex upper bound of the top-k error. We propose a fast optimization scheme based on an efficient projection onto the top-k simplex, which is of independent interest. Experiments on five datasets show consistent improvements in top-k accuracy compared to various baselines.
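For concreteness, the snippet below is a minimal Python sketch of the two quantities the abstract refers to: the top-k error, which counts a prediction as wrong only when the ground-truth class is not among the k highest-scoring classes, and a top-k hinge-style surrogate that upper-bounds it. The `topk_error` computation is standard; the exact form of `topk_hinge_loss` is an illustrative assumption and is not necessarily the tight convex upper bound proposed in the paper.

```python
import numpy as np


def topk_error(scores, labels, k=5):
    """Fraction of samples whose true class is NOT among the k highest scores."""
    # Column indices of the k largest scores per row (unordered within the top k).
    topk = np.argpartition(-scores, kth=k - 1, axis=1)[:, :k]
    hit = (topk == labels[:, None]).any(axis=1)
    return 1.0 - hit.mean()


def topk_hinge_loss(scores, labels, k=5):
    """Illustrative top-k hinge surrogate (assumed form, see lead-in):
    mean over samples of max(0, average of the k largest values of
    1 + s_j - s_y over j != y, with the true class contributing 0)."""
    n = scores.shape[0]
    s_y = scores[np.arange(n), labels]                  # score of the true class
    margins = 1.0 + scores - s_y[:, None]               # 1 + s_j - s_y for every class j
    margins[np.arange(n), labels] = 0.0                 # the true class contributes 0
    topk_margins = -np.sort(-margins, axis=1)[:, :k]    # k largest margins per sample
    return np.maximum(topk_margins.mean(axis=1), 0.0).mean()


# Toy usage: 4 samples, 10 classes, evaluate the top-5 error and the surrogate.
rng = np.random.default_rng(0)
scores = rng.normal(size=(4, 10))
labels = rng.integers(0, 10, size=4)
print(topk_error(scores, labels, k=5), topk_hinge_loss(scores, labels, k=5))
```

With k = 1 this sketch reduces to the familiar multiclass setting: `topk_error` becomes the zero-one error and `topk_hinge_loss` becomes the standard Crammer-Singer multiclass hinge loss.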
Cite
Text
Lapin et al. "Top-K Multiclass SVM." Neural Information Processing Systems, 2015.
Markdown
[Lapin et al. "Top-K Multiclass SVM." Neural Information Processing Systems, 2015.](https://mlanthology.org/neurips/2015/lapin2015neurips-topk/)
BibTeX
@inproceedings{lapin2015neurips-topk,
  title     = {{Top-K Multiclass SVM}},
  author    = {Lapin, Maksim and Hein, Matthias and Schiele, Bernt},
  booktitle = {Neural Information Processing Systems},
  year      = {2015},
  pages     = {325--333},
  url       = {https://mlanthology.org/neurips/2015/lapin2015neurips-topk/}
}