OpenCon: Open-World Contrastive Learning

Abstract

Machine learning models deployed in the wild naturally encounter unlabeled samples from both known and novel classes. Challenges arise in learning from both labeled and unlabeled data in an open-world semi-supervised manner. In this paper, we introduce a new learning framework, open-world contrastive learning (OpenCon). OpenCon tackles the challenges of learning compact representations for both known and novel classes and facilitates novelty discovery along the way. We demonstrate the effectiveness of OpenCon on challenging benchmark datasets and establish competitive performance. On the ImageNet dataset, OpenCon significantly outperforms the current best method by 11.9% and 7.4% on novel and overall classification accuracy, respectively. Theoretically, OpenCon can be rigorously interpreted from an EM algorithm perspective—minimizing our contrastive loss partially maximizes the likelihood by clustering similar samples in the embedding space. The code is available at https://github.com/deeplearning-wisc/opencon.
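
The abstract describes minimizing a contrastive loss that pulls together embeddings of samples sharing a (known or pseudo-assigned) class. As a rough illustration only, below is a minimal SupCon-style sketch in PyTorch; the function name `contrastive_loss`, the `pos_mask` construction, and the temperature value are assumptions for illustration, not the authors' exact OpenCon objective (see the linked repository for the real implementation).

```python
import torch
import torch.nn.functional as F

def contrastive_loss(features, pos_mask, temperature=0.1):
    """Generic contrastive loss over L2-normalized embeddings (a sketch,
    not the official OpenCon loss).

    features: (N, D) tensor of embeddings.
    pos_mask: (N, N) boolean tensor; pos_mask[i, j] is True when j is a
              positive for anchor i (e.g., same known label, or the same
              pseudo-label assigned to a novel-class sample).
    """
    features = F.normalize(features, dim=1)
    logits = features @ features.T / temperature  # pairwise similarities
    # Exclude self-similarity from numerator and denominator.
    self_mask = torch.eye(features.size(0), dtype=torch.bool,
                          device=features.device)
    logits = logits.masked_fill(self_mask, float('-inf'))
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_mask = pos_mask & ~self_mask
    # Average log-probability of positives per anchor (SupCon-style).
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts)
    return loss.mean()
```

For labeled data one would build `pos_mask[i, j] = (y[i] == y[j])`; for unlabeled data, positives would come from pseudo-labels produced during novelty discovery. Under this reading, each gradient step pulls an anchor toward its positive set, which matches the abstract's EM interpretation: assigning positives plays the role of the E-step, and minimizing the loss (clustering similar samples in embedding space) partially maximizes the likelihood in the M-step.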

Cite

Text

Sun and Li. "OpenCon: Open-World Contrastive Learning." Transactions on Machine Learning Research, 2023.

Markdown

[Sun and Li. "OpenCon: Open-World Contrastive Learning." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/sun2023tmlr-opencon/)

BibTeX

@article{sun2023tmlr-opencon,
  title     = {{OpenCon: Open-World Contrastive Learning}},
  author    = {Sun, Yiyou and Li, Yixuan},
  journal   = {Transactions on Machine Learning Research},
  year      = {2023},
  url       = {https://mlanthology.org/tmlr/2023/sun2023tmlr-opencon/}
}