Self-Supervised Label Augmentation via Input Transformations

Abstract

Self-supervised learning, which learns by constructing artificial labels given only the input signals, has recently gained considerable attention for learning representations with unlabeled datasets, i.e., learning without any human-annotated supervision. In this paper, we show that such a technique can be used to significantly improve the model accuracy even on fully-labeled datasets. Our scheme trains the model to learn both the original and self-supervised tasks, but differs from conventional multi-task learning frameworks that optimize the summation of their corresponding losses. Our main idea is to learn a single unified task with respect to the joint distribution of the original and self-supervised labels, i.e., we augment the original labels via self-supervision. This simple yet effective approach makes models easier to train by relaxing a certain invariance constraint when learning the original and self-supervised tasks simultaneously. It also enables an aggregated inference that combines the predictions from different augmentations to improve prediction accuracy. Furthermore, we propose a novel knowledge transfer technique, which we refer to as self-distillation, that achieves the effect of the aggregated inference in a single (faster) forward pass. We demonstrate the large accuracy improvement and wide applicability of our framework on various fully-supervised settings, e.g., the few-shot and imbalanced classification scenarios.
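The core ideas of the abstract (a single unified task over the joint label space, plus aggregated inference over transformations) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the use of four rotations as the self-supervised transformation, the label indexing scheme, and all function names are assumptions made for the example.

```python
import numpy as np

# Illustrative sizes (assumptions, not from the paper's experiments).
N_CLASSES = 10     # original class labels
N_TRANSFORMS = 4   # e.g. rotations by 0, 90, 180, 270 degrees

def joint_label(class_label, transform_id):
    """Augmented label in the joint (class, transform) space:
    one unified task with N_CLASSES * N_TRANSFORMS outputs."""
    return class_label * N_TRANSFORMS + transform_id

def aggregate_inference(joint_logits):
    """Aggregated inference over transformations.

    joint_logits: array of shape (N_TRANSFORMS, N_CLASSES * N_TRANSFORMS),
    where row t holds the joint-task logits for the input under transform t.
    For each transform t we read off the entries for (class c, transform t),
    then average across transforms to score each original class c."""
    scores = np.zeros(N_CLASSES)
    for t in range(N_TRANSFORMS):
        for c in range(N_CLASSES):
            scores[c] += joint_logits[t, joint_label(c, t)]
    return scores / N_TRANSFORMS

# Toy usage: a model that strongly prefers class 7 under every transform.
logits = np.zeros((N_TRANSFORMS, N_CLASSES * N_TRANSFORMS))
for t in range(N_TRANSFORMS):
    logits[t, joint_label(7, t)] = 5.0
predicted = int(np.argmax(aggregate_inference(logits)))
```

Self-distillation, as described in the abstract, would then train a separate classifier head to mimic these aggregated scores so that a single forward pass suffices at test time.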

Cite

Text

Lee et al. "Self-Supervised Label Augmentation via Input Transformations." International Conference on Machine Learning, 2020.

Markdown

[Lee et al. "Self-Supervised Label Augmentation via Input Transformations." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/lee2020icml-selfsupervised/)

BibTeX

@inproceedings{lee2020icml-selfsupervised,
  title     = {{Self-Supervised Label Augmentation via Input Transformations}},
  author    = {Lee, Hankook and Hwang, Sung Ju and Shin, Jinwoo},
  booktitle = {International Conference on Machine Learning},
  year      = {2020},
  pages     = {5714-5724},
  volume    = {119},
  url       = {https://mlanthology.org/icml/2020/lee2020icml-selfsupervised/}
}