Multi-Label Co-Training

Abstract

Multi-label learning aims to assign a set of appropriate labels to each sample. Although it has been successfully applied in various domains in recent years, most multi-label learning methods require abundant labeled training samples because of the large number of possible label sets. Co-training, an important branch of semi-supervised learning, can leverage unlabeled samples alongside scarce labeled ones, and can potentially reduce this demand for labeled data. However, combining multi-label learning with co-training remains a difficult challenge, with two distinct issues: (i) how to solve the widely witnessed class-imbalance problem in multi-label learning; and (ii) how to confidently select samples and communicate their predicted labels among classifiers for model refinement. To address these issues, we introduce an approach called Multi-Label Co-Training (MLCT). MLCT leverages information about the co-occurrence of pairwise labels to address the class-imbalance challenge; it introduces a predictive reliability measure to select samples, and applies label-wise filtering to confidently communicate the labels of selected samples among co-training classifiers. MLCT performs favorably against competitive multi-label learning methods on benchmark datasets and is also robust to its input parameters.
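For background, the co-training scheme that MLCT builds on trains one classifier per feature view, and in each round lets each classifier pseudo-label the unlabeled samples it is most confident about and add them to the shared training pool. The sketch below illustrates this basic single-label loop only; the `ThresholdClassifier` is a hypothetical toy learner, and the sketch does not include MLCT's label co-occurrence modeling, predictive reliability measure, or label-wise filtering:

```python
class ThresholdClassifier:
    """Toy 1-D classifier: predicts 1 above the midpoint of the class means.
    A stand-in for the per-view base learners; not from the paper."""
    def fit(self, xs, ys):
        pos = [x for x, y in zip(xs, ys) if y == 1]
        neg = [x for x, y in zip(xs, ys) if y == 0]
        self.t = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2.0
        return self

    def predict(self, x):
        return 1 if x >= self.t else 0

    def confidence(self, x):
        # Distance to the decision boundary as a crude confidence score.
        return abs(x - self.t)


def co_train(labeled, unlabeled, rounds=5):
    """Generic single-label co-training loop (illustration only).

    labeled:   list of ((view_a, view_b), y) pairs
    unlabeled: list of (view_a, view_b) pairs
    Each round, each view's classifier pseudo-labels its single most
    confident unlabeled sample and adds it to the shared labeled pool.
    """
    labeled, unlabeled = list(labeled), list(unlabeled)
    for _ in range(rounds):
        clf_a = ThresholdClassifier().fit(
            [s[0][0] for s in labeled], [s[1] for s in labeled])
        clf_b = ThresholdClassifier().fit(
            [s[0][1] for s in labeled], [s[1] for s in labeled])
        for clf, view in ((clf_a, 0), (clf_b, 1)):
            if not unlabeled:
                break
            # Pick the unlabeled sample this classifier is most confident on.
            i = max(range(len(unlabeled)),
                    key=lambda j: clf.confidence(unlabeled[j][view]))
            sample = unlabeled.pop(i)
            labeled.append((sample, clf.predict(sample[view])))
    return clf_a, clf_b, labeled
```

In the multi-label setting the paper targets, this confidence-based exchange is performed per label, which is where the class-imbalance and label-communication issues described above arise.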

Cite

Text

Xing et al. "Multi-Label Co-Training." International Joint Conference on Artificial Intelligence, 2018. doi:10.24963/IJCAI.2018/400

Markdown

[Xing et al. "Multi-Label Co-Training." International Joint Conference on Artificial Intelligence, 2018.](https://mlanthology.org/ijcai/2018/xing2018ijcai-multi/) doi:10.24963/IJCAI.2018/400

BibTeX

@inproceedings{xing2018ijcai-multi,
  title     = {{Multi-Label Co-Training}},
  author    = {Xing, Yuying and Yu, Guoxian and Domeniconi, Carlotta and Wang, Jun and Zhang, Zili},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2018},
  pages     = {2882--2888},
  doi       = {10.24963/IJCAI.2018/400},
  url       = {https://mlanthology.org/ijcai/2018/xing2018ijcai-multi/}
}