You Don’t Need Domain-Specific Data Augmentations When Scaling Self-Supervised Learning
Abstract
Self-supervised learning (SSL) with Joint-Embedding Architectures (JEA) has led to outstanding performance. All instantiations of this paradigm were trained using strong and well-established hand-crafted data augmentations, leading to the general belief that such augmentations are required for the proper training and performance of these models. On the other hand, generative reconstruction-based models such as BEiT and MAE, or Joint-Embedding Predictive Architectures such as I-JEPA, have shown strong performance without using any data augmentation except masking. In this work, we challenge the importance of invariance and data augmentation in JEAs at scale. By running a case study on DINOv2, a recent SSL foundation model, we show that strong image representations can be obtained with JEAs using only cropping without resizing, provided the training data is large enough, reaching state-of-the-art results with the least amount of augmentation in the literature. Through this study, we also discuss the impact of compute constraints on the outcomes of experimental deep-learning research, showing that they can lead to very different conclusions.
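To make the crop-only recipe concrete, here is a minimal sketch assuming a torchvision-style pipeline; the helper name make_views and the 224-pixel crop size are illustrative assumptions, not the paper's actual DINOv2 training code.

# Minimal sketch of a crop-only augmentation pipeline (hypothetical;
# not the paper's DINOv2 implementation). Unlike the usual SSL recipe
# (RandomResizedCrop, flips, color jitter, blur, solarization), the only
# operation is a fixed-size random crop with no resizing, so the pixel
# content of each view is never rescaled.
from torchvision import transforms

crop_only = transforms.Compose([
    transforms.RandomCrop(224, pad_if_needed=True),  # crop, never resize
    transforms.ToTensor(),
])

def make_views(img):
    """Produce two crop-only views of one image for a joint-embedding loss."""
    return crop_only(img), crop_only(img)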
Cite
Text
Moutakanni et al. "You Don’t Need Domain-Specific Data Augmentations When Scaling Self-Supervised Learning." Neural Information Processing Systems, 2024. doi:10.52202/079017-3686
Markdown
[Moutakanni et al. "You Don’t Need Domain-Specific Data Augmentations When Scaling Self-Supervised Learning." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/moutakanni2024neurips-you/) doi:10.52202/079017-3686
BibTeX
@inproceedings{moutakanni2024neurips-you,
title = {{You Don’t Need Domain-Specific Data Augmentations When Scaling Self-Supervised Learning}},
author = {Moutakanni, Théo and Oquab, Maxime and Szafraniec, Marc and Vakalopoulou, Maria and Bojanowski, Piotr},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-3686},
url = {https://mlanthology.org/neurips/2024/moutakanni2024neurips-you/}
}