Improving OOD Generalization of Pre-Trained Encoders via Aligned Embedding-Space Ensembles

Abstract

Without fine-tuning, self-supervised pre-trained embeddings are of poor quality on out-of-distribution (OOD) data. A straightforward approach to improving the generalization of pre-trained representations to OOD data is the use of deep ensembles. However, obtaining an effective ensemble in the embedding space with only unlabeled data remains an unsolved problem. We first perform a theoretical analysis that reveals the relationship between the individual hyperspherical embedding spaces in an ensemble. We then design a principled method to align these embedding spaces in an unsupervised manner. Experimental results on the MNIST dataset show that our embedding-space ensemble improves pre-trained embedding quality on both in-distribution and OOD data compared to single encoders.
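
To make the idea concrete, below is a minimal sketch of an embedding-space ensemble of two encoders on shared unlabeled inputs: their hyperspherical embeddings are aligned with an orthogonal Procrustes map and then averaged and re-normalized. The Procrustes alignment, the NumPy helpers, and the random toy data are illustrative assumptions, not necessarily the authors' exact procedure.

# Sketch: unsupervised alignment of two hyperspherical embedding spaces
# followed by an embedding-space ensemble (average of aligned embeddings).
import numpy as np

def l2_normalize(x, eps=1e-12):
    # Project each row onto the unit hypersphere.
    return x / np.maximum(np.linalg.norm(x, axis=1, keepdims=True), eps)

def procrustes_align(src, ref):
    # Orthogonal map R minimizing ||src @ R - ref||_F, solved via SVD.
    u, _, vt = np.linalg.svd(src.T @ ref)
    return u @ vt

def ensemble_embed(emb_a, emb_b):
    # Align encoder A's embeddings to encoder B's space, then average
    # and re-project the mean onto the hypersphere.
    a, b = l2_normalize(emb_a), l2_normalize(emb_b)
    r = procrustes_align(a, b)
    return l2_normalize(a @ r + b)

# Toy usage with random arrays standing in for the two encoders' outputs
# on the same unlabeled batch (no labels are used anywhere above).
rng = np.random.default_rng(0)
emb_a = rng.normal(size=(256, 64))
emb_b = rng.normal(size=(256, 64))
print(ensemble_embed(emb_a, emb_b).shape)  # (256, 64)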

Cite

Text

Peng et al. "Improving OOD Generalization of Pre-Trained Encoders via Aligned Embedding-Space Ensembles." NeurIPS 2024 Workshops: UniReps, 2024.

Markdown

[Peng et al. "Improving OOD Generalization of Pre-Trained Encoders via Aligned Embedding-Space Ensembles." NeurIPS 2024 Workshops: UniReps, 2024.](https://mlanthology.org/neuripsw/2024/peng2024neuripsw-improving-a/)

BibTeX

@inproceedings{peng2024neuripsw-improving-a,
  title     = {{Improving OOD Generalization of Pre-Trained Encoders via Aligned Embedding-Space Ensembles}},
  author    = {Peng, Shuman and Khoeini, Arash and Vaswani, Sharan and Ester, Martin},
  booktitle = {NeurIPS 2024 Workshops: UniReps},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/peng2024neuripsw-improving-a/}
}