Convex Calibrated Surrogates for Low-Rank Loss Matrices with Applications to Subset Ranking Losses
Abstract
The design of convex, calibrated surrogate losses, whose minimization entails consistency with respect to a desired target loss, is an important concept to have emerged in the theory of machine learning in recent years. We give an explicit construction of a convex least-squares type surrogate loss that can be designed to be calibrated for any multiclass learning problem for which the target loss matrix has a low-rank structure; the surrogate loss operates on a surrogate target space of dimension at most the rank of the target loss. We use this result to design convex calibrated surrogates for a variety of subset ranking problems, with target losses including the precision@q, expected rank utility, mean average precision, and pairwise disagreement.
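The abstract only sketches the construction, so the following is a minimal numerical sketch of the low-rank idea it describes (not the authors' code): factor the loss matrix L as A Bᵀ with inner dimension rank(L), fit a squared (least-squares) surrogate over R^rank(L) toward the rows a_y of A, and decode a surrogate prediction u by choosing the target t minimizing ⟨u, b_t⟩. The function names (factorize_loss, squared_surrogate, decode) and the SVD-based factorization are illustrative assumptions.

```python
import numpy as np

def factorize_loss(L):
    """Factor L (n x k) as A @ B.T with inner dimension equal to rank(L)."""
    U, s, Vt = np.linalg.svd(L, full_matrices=False)
    r = int(np.sum(s > 1e-10))          # numerical rank of the loss matrix
    A = U[:, :r] * s[:r]                # rows a_y of A live in R^r
    B = Vt[:r, :].T                     # rows b_t of B live in R^r
    return A, B

def squared_surrogate(u, y, A):
    """Least-squares surrogate psi(u, y) = ||u - a_y||^2 on R^r."""
    return float(np.sum((u - A[y]) ** 2))

def decode(u, B):
    """Map a surrogate prediction u in R^r to a target prediction t."""
    return int(np.argmin(B @ u))

# Illustration of calibration at a single instance x: the minimizer of the
# expected surrogate E_y[psi(u, y)] is u* = sum_y p_y a_y, and decoding u*
# recovers a Bayes-optimal target prediction argmin_t sum_y p_y L[y, t].
rng = np.random.default_rng(0)
L = (rng.random((5, 1)) @ rng.random((1, 7))
     + rng.random((5, 1)) @ rng.random((1, 7)))   # a 5 x 7 loss matrix of rank <= 2
A, B = factorize_loss(L)
p = rng.dirichlet(np.ones(5))                     # class-conditional distribution at x
u_star = p @ A                                    # minimizer of the expected surrogate
assert decode(u_star, B) == int(np.argmin(p @ L))
```

In this toy example the surrogate prediction space has dimension 2 rather than 7, mirroring the abstract's claim that the surrogate operates on a space of dimension at most the rank of the target loss.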
Cite
Text
Ramaswamy et al. "Convex Calibrated Surrogates for Low-Rank Loss Matrices with Applications to Subset Ranking Losses." Neural Information Processing Systems, 2013.

Markdown
[Ramaswamy et al. "Convex Calibrated Surrogates for Low-Rank Loss Matrices with Applications to Subset Ranking Losses." Neural Information Processing Systems, 2013.](https://mlanthology.org/neurips/2013/ramaswamy2013neurips-convex/)

BibTeX
@inproceedings{ramaswamy2013neurips-convex,
title = {{Convex Calibrated Surrogates for Low-Rank Loss Matrices with Applications to Subset Ranking Losses}},
author = {Ramaswamy, Harish G. and Agarwal, Shivani and Tewari, Ambuj},
booktitle = {Neural Information Processing Systems},
year = {2013},
pages = {1475-1483},
url = {https://mlanthology.org/neurips/2013/ramaswamy2013neurips-convex/}
}