Beta-Risk: A New Surrogate Risk for Learning from Weakly Labeled Data

Abstract

During the past few years, the machine learning community has paid increasing attention to developing new methods for learning from weakly labeled data. This field covers different settings such as semi-supervised learning, learning with label proportions, multi-instance learning, noise-tolerant learning, etc. This paper presents a generic framework to deal with these weakly labeled scenarios. We introduce the beta-risk as a generalized formulation of the standard empirical risk based on surrogate margin-based loss functions. This risk allows us to express the reliability of the labels and to derive different kinds of learning algorithms. We specifically focus on SVMs and propose a soft-margin beta-SVM algorithm which behaves better than the state of the art.
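For intuition, a plausible form of such a generalized risk, sketched from the abstract rather than the paper's exact definition, weights each candidate label of each example by a confidence coefficient β:

$$
\hat{R}_{\beta}(h) \;=\; \frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \beta_{ik}\, \ell\big(h(x_i), k\big),
\qquad \beta_{ik} \ge 0, \quad \sum_{k=1}^{K} \beta_{ik} = 1,
$$

where $\ell$ is a margin-based surrogate loss and $\beta_{ik}$ encodes the (possibly weak) belief that example $x_i$ belongs to class $k$. Under this reading, choosing one-hot $\beta$ vectors recovers the standard fully supervised empirical risk, while softer $\beta$ values capture settings such as noisy or partially observed labels.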

Cite

Text

Zantedeschi et al. "Beta-Risk: A New Surrogate Risk for Learning from Weakly Labeled Data." Neural Information Processing Systems, 2016.

Markdown

[Zantedeschi et al. "Beta-Risk: A New Surrogate Risk for Learning from Weakly Labeled Data." Neural Information Processing Systems, 2016.](https://mlanthology.org/neurips/2016/zantedeschi2016neurips-betarisk/)

BibTeX

@inproceedings{zantedeschi2016neurips-betarisk,
  title     = {{Beta-Risk: A New Surrogate Risk for Learning from Weakly Labeled Data}},
  author    = {Zantedeschi, Valentina and Emonet, Rémi and Sebban, Marc},
  booktitle = {Neural Information Processing Systems},
  year      = {2016},
  pages     = {4365--4373},
  url       = {https://mlanthology.org/neurips/2016/zantedeschi2016neurips-betarisk/}
}