TLDR: Twin Learning for Dimensionality Reduction

Abstract

Dimensionality reduction methods are unsupervised approaches which learn low-dimensional spaces where some properties of the initial space, typically the notion of “neighborhood”, are preserved. Such methods usually require propagation on large k-NN graphs or complicated optimization solvers. On the other hand, self-supervised learning approaches, typically used to learn representations from scratch, rely on simple and more scalable frameworks for learning. In this paper, we propose TLDR, a dimensionality reduction method for generic input spaces that ports the recent self-supervised learning framework of Zbontar et al. (2021) to the specific task of dimensionality reduction, over arbitrary representations. We propose to use nearest neighbors to build pairs from a training set and a redundancy reduction loss to learn an encoder that produces representations invariant across such pairs. TLDR is a method that is simple, easy to train, and of broad applicability; it consists of an offline nearest neighbor computation step that can be highly approximated, and a straightforward learning process. Aiming for scalability, we focus on improving linear dimensionality reduction, and show consistent gains on image and document retrieval tasks, e.g. gaining +4% mAP over PCA on ROxford for GeM-AP, improving the performance of DINO on ImageNet or retaining it with a 10× compression.
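
The sketch below illustrates the two-step recipe described above: an offline nearest-neighbor computation to build positive pairs, followed by training a linear encoder (with an auxiliary projector) using a Barlow Twins-style redundancy reduction loss. It is a minimal illustration assuming PyTorch; the hyper-parameters, projector shape, and brute-force k-NN backend are placeholders, not the paper's exact settings.

# Minimal sketch of the TLDR idea (assumed: PyTorch; hyper-parameters,
# projector shape and k-NN backend are illustrative, not the paper's settings).
import torch
import torch.nn as nn

def knn_pairs(X, k=3):
    """Offline step: for every training vector, find its k nearest neighbors
    (brute-force here; an approximate index would be used at scale)."""
    d = torch.cdist(X, X)                      # pairwise L2 distances
    d.fill_diagonal_(float("inf"))             # exclude self-matches
    return d.topk(k, largest=False).indices    # (N, k) neighbor ids

def redundancy_reduction_loss(za, zb, lam=5e-3):
    """Barlow Twins-style loss: the cross-correlation matrix of the two
    batches of projected embeddings is pushed toward the identity."""
    za = (za - za.mean(0)) / (za.std(0) + 1e-6)
    zb = (zb - zb.mean(0)) / (zb.std(0) + 1e-6)
    c = (za.T @ zb) / za.shape[0]              # (proj_dim, proj_dim)
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + lam * off_diag

# Toy data standing in for pre-extracted image or document features.
N, D, d_out = 2048, 512, 128
X = torch.randn(N, D)
neighbors = knn_pairs(X, k=3)

encoder = nn.Linear(D, d_out)                  # the linear reduction map we keep
projector = nn.Sequential(                     # auxiliary head, discarded after training
    nn.Linear(d_out, 1024), nn.BatchNorm1d(1024), nn.ReLU(), nn.Linear(1024, 1024)
)
opt = torch.optim.Adam(list(encoder.parameters()) + list(projector.parameters()), lr=1e-3)

for step in range(100):
    idx = torch.randint(0, N, (256,))                     # anchors
    nn_idx = neighbors[idx, torch.randint(0, 3, (256,))]  # one random neighbor each
    za = projector(encoder(X[idx]))
    zb = projector(encoder(X[nn_idx]))
    loss = redundancy_reduction_loss(za, zb)
    opt.zero_grad(); loss.backward(); opt.step()

# After training, encoder(X) yields the 128-d representations used for retrieval.

After training, only the linear encoder is kept as a drop-in replacement for a PCA projection; the projector and the neighbor graph are discarded.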

Cite

Text

Kalantidis et al. "TLDR: Twin Learning for Dimensionality Reduction." Transactions on Machine Learning Research, 2022.

Markdown

[Kalantidis et al. "TLDR: Twin Learning for Dimensionality Reduction." Transactions on Machine Learning Research, 2022.](https://mlanthology.org/tmlr/2022/kalantidis2022tmlr-tldr/)

BibTeX

@article{kalantidis2022tmlr-tldr,
  title     = {{TLDR: Twin Learning for Dimensionality Reduction}},
  author    = {Kalantidis, Yannis and Lassance, Carlos Eduardo Rosar Kos and Almazán, Jon and Larlus, Diane},
  journal   = {Transactions on Machine Learning Research},
  year      = {2022},
  url       = {https://mlanthology.org/tmlr/2022/kalantidis2022tmlr-tldr/}
}