SCoRe: Submodular Combinatorial Representation Learning

Abstract

In this paper we introduce SCoRe (Submodular Combinatorial Representation Learning), a novel representation learning framework that addresses inter-class bias and intra-class variance. SCoRe offers a new combinatorial viewpoint on representation learning by introducing a family of loss functions based on set-based submodular information measures. We develop two novel combinatorial formulations of loss functions, based on the Total Information and Total Correlation, that naturally minimize intra-class variance and inter-class bias. Several commonly used metric/contrastive learning losses, such as supervised contrastive loss, orthogonal projection loss, and N-pairs loss, are instances of SCoRe, underlining its versatility and applicability across a broad spectrum of learning scenarios. The novel objectives in SCoRe naturally model class imbalance, yielding improvements of up to 7.6% in classification on CIFAR-10-LT, CIFAR-100-LT, and MedMNIST, 2.1% on ImageNet-LT, and 19.4% in object detection on IDD and LVIS (v1.0), demonstrating its effectiveness over existing approaches.
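As a concrete illustration of the set-based viewpoint, the sketch below implements a Total-Information-style objective in PyTorch, using the standard definition of the measure, L(θ) = Σ_k f(A_k), where A_k is the set of embeddings of class k in a mini-batch and f is a submodular function. The facility-location choice of f, the cosine-similarity kernel, and all function names here are illustrative assumptions rather than the authors' released code; the Total Correlation variant additionally subtracts f(V), the value of f on the full batch.

```python
import torch
import torch.nn.functional as F

def facility_location(sim_cols: torch.Tensor) -> torch.Tensor:
    # Facility-location value f(A) = sum_{i in V} max_{j in A} s(i, j),
    # where `sim_cols` is the |V| x |A| slice of the batch similarity matrix.
    return sim_cols.max(dim=1).values.sum()

def total_information_loss(features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    # Total-Information objective L = sum_k f(A_k), summed over the per-class
    # feature sets A_k of the batch (an illustrative instantiation of SCoRe,
    # not the authors' implementation).
    z = F.normalize(features, dim=1)      # cosine-similarity kernel (assumed)
    sim = z @ z.t()
    loss = sim.new_zeros(())
    for k in labels.unique():
        cols = (labels == k).nonzero(as_tuple=True)[0]
        loss = loss + facility_location(sim[:, cols])
    return loss

# Toy usage: 8 embeddings from 3 classes.
feats = torch.randn(8, 16, requires_grad=True)
labels = torch.tensor([0, 0, 1, 1, 1, 2, 2, 2])
loss = total_information_loss(feats, labels)
loss.backward()                            # gradients flow through the set function
```

Because f is evaluated on sets of embeddings rather than on individual pairs, swapping in a different submodular function for f changes the loss family without changing the training loop, which is how SCoRe can recover pairwise losses such as supervised contrastive loss as special cases.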

Cite

Text

Majee et al. "SCoRe: Submodular Combinatorial Representation Learning." International Conference on Machine Learning, 2024.

Markdown

[Majee et al. "SCoRe: Submodular Combinatorial Representation Learning." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/majee2024icml-score/)

BibTeX

@inproceedings{majee2024icml-score,
  title     = {{SCoRe: Submodular Combinatorial Representation Learning}},
  author    = {Majee, Anay and Kothawade, Suraj Nandkishor and Killamsetty, Krishnateja and Iyer, Rishabh K},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {34327--34349},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/majee2024icml-score/}
}