Multi-Class Classification Without Multi-Class Labels

Abstract

This work presents a new strategy for multi-class classification that requires no class-specific labels, but instead leverages pairwise similarity between examples, which is a weaker form of annotation. The proposed method, meta classification learning, optimizes a binary classifier for pairwise similarity prediction and, through this process, learns a multi-class classifier as a submodule. We formulate this approach, present a probabilistic graphical model for it, and derive a surprisingly simple loss function that can be used to learn neural network-based models. We then demonstrate that this same framework generalizes to the supervised, unsupervised cross-task, and semi-supervised settings. Our method is evaluated against the state of the art in all three learning paradigms and shows superior or comparable accuracy, providing evidence that learning multi-class classification without multi-class labels is a viable learning option.
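To make the "surprisingly simple loss" concrete: the multi-class classifier's softmax output assigns each example a class-probability vector, the inner product of two such vectors serves as the predicted probability that the pair belongs to the same class, and binary cross-entropy against the pairwise similarity labels trains the network end to end. The PyTorch snippet below is a minimal sketch under these assumptions; the function name meta_classification_loss and the way pairs are drawn from a batch are illustrative choices of ours, not details fixed by the paper.

import torch
import torch.nn.functional as F

def meta_classification_loss(logits_a, logits_b, similarity, eps=1e-7):
    """Pairwise binary cross-entropy over class-probability inner products.

    logits_a, logits_b: (N, C) classifier outputs for the two sides of N pairs.
    similarity: (N,) float tensor, 1.0 = same class, 0.0 = different class.
    """
    p_a = F.softmax(logits_a, dim=1)
    p_b = F.softmax(logits_b, dim=1)
    # Predicted probability that a pair is similar: the inner product of the
    # two class-probability vectors (clamped so log never sees 0 or 1).
    s_hat = (p_a * p_b).sum(dim=1).clamp(eps, 1 - eps)
    return F.binary_cross_entropy(s_hat, similarity)

# Toy usage: pair labels are derived here from class labels only to simulate
# the weak pairwise annotation; the loss itself never sees class identities.
torch.manual_seed(0)
net = torch.nn.Linear(16, 10)            # stand-in multi-class classifier
x = torch.randn(32, 16)
y = torch.randint(0, 10, (32,))
i, j = torch.triu_indices(32, 32, offset=1)
sim = (y[i] == y[j]).float()             # pairwise similarity supervision
loss = meta_classification_loss(net(x[i]), net(x[j]), sim)
loss.backward()

Note that only binary similarity labels enter the loss, yet the gradient flows through the softmax head, so the multi-class classifier is learned as a submodule of the binary similarity predictor, which is the core idea the abstract describes.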

Cite

Text

Hsu et al. "Multi-Class Classification Without Multi-Class Labels." International Conference on Learning Representations, 2019.

Markdown

[Hsu et al. "Multi-Class Classification Without Multi-Class Labels." International Conference on Learning Representations, 2019.](https://mlanthology.org/iclr/2019/hsu2019iclr-multiclass/)

BibTeX

@inproceedings{hsu2019iclr-multiclass,
  title     = {{Multi-Class Classification Without Multi-Class Labels}},
  author    = {Hsu, Yen-Chang and Lv, Zhaoyang and Schlosser, Joel and Odom, Phillip and Kira, Zsolt},
  booktitle = {International Conference on Learning Representations},
  year      = {2019},
  url       = {https://mlanthology.org/iclr/2019/hsu2019iclr-multiclass/}
}