The Pessimistic Limits and Possibilities of Margin-Based Losses in Semi-Supervised Learning
Abstract
Consider a classification problem where we have both labeled and unlabeled data available. We show that for linear classifiers defined by convex margin-based surrogate losses that are decreasing, it is impossible to construct any semi-supervised approach that is able to guarantee an improvement over the supervised classifier measured by this surrogate loss on the labeled and unlabeled data. For convex margin-based loss functions that also increase, we demonstrate safe improvements are possible.
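A minimal illustration of the distinction the abstract draws (a sketch; the notation is assumed for exposition, not quoted from the paper): for a linear classifier, a margin-based loss depends on an example (x, y) only through its margin m = y w^T x, so the standard surrogates can be written as

\[
\phi_{\mathrm{hinge}}(m) = \max(0,\, 1 - m), \qquad
\phi_{\mathrm{logistic}}(m) = \log\!\left(1 + e^{-m}\right), \qquad
\phi_{\mathrm{quadratic}}(m) = (1 - m)^2 .
\]

The hinge and logistic losses are convex and decreasing in m, so they fall under the impossibility result above; the quadratic loss is convex but increases for m > 1, which is the kind of loss for which safe improvements are shown to be possible.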
Cite

Text

Krijthe and Loog. "The Pessimistic Limits and Possibilities of Margin-Based Losses in Semi-Supervised Learning." Neural Information Processing Systems, 2018.

Markdown

[Krijthe and Loog. "The Pessimistic Limits and Possibilities of Margin-Based Losses in Semi-Supervised Learning." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/krijthe2018neurips-pessimistic/)

BibTeX
@inproceedings{krijthe2018neurips-pessimistic,
title = {{The Pessimistic Limits and Possibilities of Margin-Based Losses in Semi-Supervised Learning}},
author = {Krijthe, Jesse and Loog, Marco},
booktitle = {Neural Information Processing Systems},
year = {2018},
pages = {1790--1799},
url = {https://mlanthology.org/neurips/2018/krijthe2018neurips-pessimistic/}
}