Semi-Supervised Robust Deep Neural Networks for Multi-Label Classification
Abstract
In this paper, we propose a robust method for semi-supervised training of deep neural networks for multi-label image classification. To this end, we use ramp loss, which is more robust against noisy and incomplete image labels compared to the classical hinge loss. The proposed method allows for learning from both labeled and unlabeled data in a semi-supervised learning setting. This is achieved by propagating labels from the labeled images to their unlabeled neighbors. Using a robust loss function becomes crucial here, as the initial label propagations may include many errors, which degrades the performance of non-robust loss functions. In contrast, the proposed robust ramp loss restricts extreme penalties for the samples with incorrect labels, and the label assignment improves in each iteration and contributes to the learning process. The proposed method achieves state-of-the-art results in semi-supervised learning experiments on the CIFAR-10 and STL-10 datasets, and comparable results to the state-of-the-art in supervised learning experiments on the NUS-WIDE and MS-COCO datasets.
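The robustness property described above can be illustrated with a minimal sketch: the ramp loss is the hinge loss clipped at a threshold, so a badly mislabeled sample (very negative margin) incurs a bounded penalty instead of an unbounded one. The clipping parameter `s` below is an assumption for illustration; the paper's exact formulation may differ.

```python
import numpy as np

def hinge_loss(margin):
    # Classical hinge: penalty grows without bound as the margin
    # becomes more negative, so label noise dominates training.
    return np.maximum(0.0, 1.0 - margin)

def ramp_loss(margin, s=-1.0):
    # Ramp loss: hinge clipped at 1 - s (here s = -1, an assumed
    # setting), so mislabeled samples with very negative margins
    # contribute at most a constant penalty.
    return np.minimum(1.0 - s, hinge_loss(margin))

margins = np.array([2.0, 0.5, -5.0])
print(hinge_loss(margins))  # unbounded for margin = -5
print(ramp_loss(margins))   # capped at 1 - s = 2
```

For a correctly classified sample (margin 2.0) both losses are zero, and near the decision boundary (margin 0.5) they agree; the two diverge only for the grossly mislabeled sample, which is exactly where the abstract argues robustness matters for noisy propagated labels.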
Cite
Text
Cevikalp et al. "Semi-Supervised Robust Deep Neural Networks for Multi-Label Classification." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019.
Markdown
[Cevikalp et al. "Semi-Supervised Robust Deep Neural Networks for Multi-Label Classification." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019.](https://mlanthology.org/cvprw/2019/cevikalp2019cvprw-semisupervised/)
BibTeX
@inproceedings{cevikalp2019cvprw-semisupervised,
title = {{Semi-Supervised Robust Deep Neural Networks for Multi-Label Classification}},
author = {Cevikalp, Hakan and Benligiray, Burak and Gerek, Ömer Nezih and Saribas, Hasan},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2019},
pages = {9--17},
url = {https://mlanthology.org/cvprw/2019/cevikalp2019cvprw-semisupervised/}
}