I-Con: A Unifying Framework for Representation Learning
Abstract
As the field of representation learning grows, there has been a proliferation of different loss functions to solve different classes of problems. We introduce a single information-theoretic equation that generalizes a large collection of modern loss functions in machine learning. In particular, we introduce a framework that shows that several broad classes of machine learning methods are precisely minimizing an integrated KL divergence between two conditional distributions: the supervisory and learned representations. This viewpoint exposes a hidden information geometry underlying clustering, spectral methods, dimensionality reduction, contrastive learning, and supervised learning. This framework enables the development of new loss functions by combining successful techniques from across the literature. We not only present a wide array of proofs, connecting over 23 different approaches, but we also leverage these theoretical results to create state-of-the-art unsupervised image classifiers that achieve a +8% improvement over the prior state-of-the-art on unsupervised classification on ImageNet-1K. We also demonstrate that I-Con can be used to derive principled debiasing methods which improve contrastive representation learners.
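The "integrated KL divergence between two conditional distributions" mentioned above can be sketched as follows. This is an illustrative rendering based only on the abstract's description, not a formula copied from this page; the symbols $p$, $q_{\theta}$, and the datapoint index $i$ are assumptions:

```latex
% Hedged sketch of the unified I-Con objective: for each datapoint i,
% a fixed supervisory conditional distribution p(. | i) over other
% datapoints is matched to a learned conditional distribution
% q_theta(. | i), with the KL divergence integrated over all i.
\mathcal{L}(\theta)
  = \int_{i} D_{\mathrm{KL}}\!\left(
      p(\,\cdot \mid i\,) \,\big\|\, q_{\theta}(\,\cdot \mid i\,)
    \right) \, di
```

Under this reading, choosing different supervisory distributions $p$ and different parameterizations of $q_{\theta}$ recovers the various method families the abstract lists (clustering, spectral methods, dimensionality reduction, contrastive and supervised learning).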
Cite

Text

Alshammari et al. "I-Con: A Unifying Framework for Representation Learning." International Conference on Learning Representations, 2025.

Markdown

[Alshammari et al. "I-Con: A Unifying Framework for Representation Learning." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/alshammari2025iclr-icon/)

BibTeX
@inproceedings{alshammari2025iclr-icon,
title = {{I-Con: A Unifying Framework for Representation Learning}},
author = {Alshammari, Shaden Naif and Hershey, John R. and Feldmann, Axel and Freeman, William T. and Hamilton, Mark},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/alshammari2025iclr-icon/}
}