An Information-Theoretic Understanding of Maximum Manifold Capacity Representations

Abstract

Maximum Manifold Capacity Representations (MMCR) is a recent multi-view self-supervised learning (MVSSL) method that matches or surpasses other leading MVSSL methods. MMCR is interesting for at least two reasons. First, MMCR is an oddity in the zoo of MVSSL methods: it is not (explicitly) contrastive, applies no masking, performs no clustering, leverages no distillation, and does not (explicitly) reduce redundancy. Second, while many self-supervised learning (SSL) methods originate in information theory, MMCR distinguishes itself by claiming a different origin: a statistical mechanical characterization of the geometry of linear separability of data manifolds. However, given the rich connections between statistical mechanics and information theory, and given recent work showing that many SSL methods can be understood from an information-theoretic perspective, we conjecture that MMCR can be similarly understood. In this paper, we leverage tools from high-dimensional probability and information theory to demonstrate that an optimal solution to MMCR's nuclear norm-based objective function also maximizes a well-known lower bound on mutual information.
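
To make the nuclear norm-based objective mentioned in the abstract concrete, here is a minimal, unofficial PyTorch sketch (not the authors' reference implementation). It assumes the encoder produces a tensor `views` of shape (n_views, batch_size, embed_dim) containing embeddings of augmented views of each image; the tensor name, shapes, and function name are illustrative assumptions.

import torch

def mmcr_loss(views: torch.Tensor) -> torch.Tensor:
    # views: (n_views, batch_size, embed_dim) embeddings of augmented views.
    # Project each embedding onto the unit hypersphere.
    views = torch.nn.functional.normalize(views, dim=-1)
    # Average over views to get one centroid per object: (batch_size, embed_dim).
    centroids = views.mean(dim=0)
    # MMCR maximizes the nuclear norm (sum of singular values) of the
    # centroid matrix, so the loss to minimize is its negative.
    return -torch.linalg.matrix_norm(centroids, ord="nuc")

Intuitively, maximizing the nuclear norm of the centroid matrix encourages the views of each object to align (yielding long centroids) while spreading different objects' centroids across many directions; this geometric property is what the paper connects to maximizing a lower bound on mutual information.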

Cite

Text

Isik et al. "An Information-Theoretic Understanding of Maximum Manifold Capacity Representations." NeurIPS 2023 Workshops: UniReps, 2023.

Markdown

[Isik et al. "An Information-Theoretic Understanding of Maximum Manifold Capacity Representations." NeurIPS 2023 Workshops: UniReps, 2023.](https://mlanthology.org/neuripsw/2023/isik2023neuripsw-informationtheoretic/)

BibTeX

@inproceedings{isik2023neuripsw-informationtheoretic,
  title     = {{An Information-Theoretic Understanding of Maximum Manifold Capacity Representations}},
  author    = {Isik, Berivan and Lecomte, Victor and Schaeffer, Rylan and LeCun, Yann and Khona, Mikail and Shwartz-Ziv, Ravid and Koyejo, Sanmi and Gromov, Andrey},
  booktitle = {NeurIPS 2023 Workshops: UniReps},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/isik2023neuripsw-informationtheoretic/}
}