Superclass-Conditional Gaussian Mixture Model for Learning Fine-Grained Embeddings

Abstract

Learning fine-grained embeddings is essential for extending the generalizability of models pre-trained on "coarse" labels (e.g., animals). It is crucial in fields where fine-grained labeling (e.g., breeds of animals) is expensive but fine-grained prediction is desirable, such as medicine. This dilemma necessitates adapting a "coarsely" pre-trained model to new tasks with only a few "finer-grained" training labels. However, coarsely supervised pre-training tends to suppress intra-class variation, which is vital for cross-granularity adaptation. In this paper, we develop a training framework built on a novel superclass-conditional Gaussian mixture model (SCGM). SCGM imitates the generative process of samples from hierarchies of classes through latent-variable modeling of the fine-grained subclasses. The framework is agnostic to the encoder and adds only a few distribution-related parameters, so it is efficient and flexible across domains. The model parameters are learned end-to-end by maximum-likelihood estimation via a principled Expectation-Maximization algorithm. Extensive experiments on benchmark datasets and a real-life medical dataset demonstrate the effectiveness of our method.
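As a rough reading of the abstract (not the paper's exact formulation), the generative process can be sketched as follows: given a superclass label c, draw a latent subclass k from Categorical(pi_c), then draw an embedding z from N(mu_{c,k}, Sigma_{c,k}); EM then alternates between inferring the posterior over k and re-estimating the distribution parameters. Below is a minimal NumPy sketch of one such EM step over fixed embeddings. It assumes spherical unit-variance Gaussians and K subclasses per superclass; the name scgm_em_step and all variable names are illustrative, and the paper's end-to-end training of the encoder is omitted here.

# Minimal sketch of one EM step for a superclass-conditional Gaussian
# mixture over embeddings. Assumptions (not from the paper): spherical
# unit-variance Gaussians, K subclasses per superclass, and embeddings
# treated as fixed constants rather than trainable encoder outputs.
import numpy as np

def scgm_em_step(z, y, means, log_pi):
    """One EM step.

    z:      (N, D) embeddings.
    y:      (N,) integer superclass labels in [0, C).
    means:  (C, K, D) subclass means, K subclasses per superclass.
    log_pi: (C, K) log mixing weights of subclasses within each superclass.
    """
    C, K, D = means.shape
    new_means = np.zeros_like(means)
    new_log_pi = np.zeros_like(log_pi)
    for c in range(C):
        zc = z[y == c]                              # samples of superclass c
        if len(zc) == 0:                            # no samples: keep old params
            new_means[c], new_log_pi[c] = means[c], log_pi[c]
            continue
        # E-step: responsibilities over the K subclasses of superclass c.
        # log N(z | mu_k, I) up to a constant is -||z - mu_k||^2 / 2.
        sq = ((zc[:, None, :] - means[c][None, :, :]) ** 2).sum(-1)
        logits = log_pi[c][None, :] - 0.5 * sq      # (Nc, K)
        logits -= logits.max(axis=1, keepdims=True) # stabilize softmax
        r = np.exp(logits)
        r /= r.sum(axis=1, keepdims=True)           # posterior q(k | z, c)
        # M-step: re-estimate subclass means and mixing weights.
        nk = r.sum(axis=0) + 1e-8
        new_means[c] = (r.T @ zc) / nk[:, None]
        new_log_pi[c] = np.log(nk / nk.sum())
    return new_means, new_log_pi

For example, with 5 superclasses, 4 subclasses each, and 32-dimensional embeddings: initialize means with rng.normal(size=(5, 4, 32)) and log_pi with np.full((5, 4), np.log(0.25)), then call scgm_em_step repeatedly until the parameters stabilize. Conditioning the mixture on the observed superclass is what restricts each E-step to that superclass's own K components.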

Cite

Text

Ni et al. "Superclass-Conditional Gaussian Mixture Model for Learning Fine-Grained Embeddings." International Conference on Learning Representations, 2022.

Markdown

[Ni et al. "Superclass-Conditional Gaussian Mixture Model for Learning Fine-Grained Embeddings." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/ni2022iclr-superclassconditional/)

BibTeX

@inproceedings{ni2022iclr-superclassconditional,
  title     = {{Superclass-Conditional Gaussian Mixture Model for Learning Fine-Grained Embeddings}},
  author    = {Ni, Jingchao and Cheng, Wei and Chen, Zhengzhang and Asakura, Takayoshi and Soma, Tomoya and Kato, Sho and Chen, Haifeng},
  booktitle = {International Conference on Learning Representations},
  year      = {2022},
  url       = {https://mlanthology.org/iclr/2022/ni2022iclr-superclassconditional/}
}