On the Generalization Ability of Unsupervised Pretraining

Abstract

Recent advances in unsupervised learning have shown that unsupervised pre-training, followed by fine-tuning, can improve model generalization. However, a rigorous understanding of how the representation function learned on an unlabeled dataset affects the generalization of the fine-tuned model is lacking. Existing theoretical research does not adequately account for the heterogeneity of the distributions and tasks across the pre-training and fine-tuning stages. To bridge this gap, this paper introduces a novel theoretical framework that illuminates the critical factor influencing the transferability of knowledge acquired during unsupervised pre-training to the subsequent fine-tuning phase, ultimately affecting the generalization capabilities of the fine-tuned model on downstream tasks. We apply our theoretical framework to analyze the generalization bounds in two distinct scenarios: Context Encoder pre-training with deep neural networks and Masked Autoencoder pre-training with deep transformers, each followed by fine-tuning on a binary classification task. Finally, inspired by our findings, we propose a novel regularization method during pre-training that further enhances the generalization of the fine-tuned model. Overall, our results contribute to a better understanding of the unsupervised pre-training and fine-tuning paradigm, and can shed light on the design of more effective pre-training algorithms.
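To make the two-stage paradigm the abstract refers to concrete, below is a minimal, illustrative sketch in PyTorch: an autoencoder is pre-trained on unlabeled data with a masked-reconstruction objective, and its encoder is then reused with a linear head for a binary classification task. This is not the authors' algorithm or their regularizer; the architecture, masking ratio, and all hyperparameters are hypothetical choices made only for illustration.

```python
# Illustrative sketch of unsupervised pre-training + fine-tuning
# (hypothetical sizes and hyperparameters, not the paper's method).
import torch
import torch.nn as nn

d, hidden = 32, 64  # hypothetical input / representation dimensions

encoder = nn.Sequential(nn.Linear(d, hidden), nn.ReLU())
decoder = nn.Linear(hidden, d)

# --- Stage 1: unsupervised pre-training via masked reconstruction ---
x_unlabeled = torch.randn(256, d)  # stand-in for unlabeled data
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3
)
for _ in range(100):
    mask = (torch.rand_like(x_unlabeled) > 0.5).float()  # hide half the coordinates
    recon = decoder(encoder(x_unlabeled * mask))
    loss = ((recon - x_unlabeled) ** 2).mean()  # reconstruct the full input
    opt.zero_grad()
    loss.backward()
    opt.step()

# --- Stage 2: supervised fine-tuning on a binary downstream task ---
head = nn.Linear(hidden, 1)
x_labeled = torch.randn(64, d)               # stand-in for labeled data
y = torch.randint(0, 2, (64, 1)).float()     # stand-in for binary labels
opt_ft = torch.optim.Adam(
    list(encoder.parameters()) + list(head.parameters()), lr=1e-3
)
bce = nn.BCEWithLogitsLoss()
for _ in range(100):
    logits = head(encoder(x_labeled))        # reuse the pre-trained encoder
    loss = bce(logits, y)
    opt_ft.zero_grad()
    loss.backward()
    opt_ft.step()
```

The point of the sketch is the structure the paper's analysis targets: the representation function (here, `encoder`) is learned on one distribution and task, then transferred to a different distribution and task, which is exactly the heterogeneity the framework accounts for.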

Cite

Text

Deng et al. "On the Generalization Ability of Unsupervised Pretraining." Artificial Intelligence and Statistics, 2024.

Markdown

[Deng et al. "On the Generalization Ability of Unsupervised Pretraining." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/deng2024aistats-generalization/)

BibTeX

@inproceedings{deng2024aistats-generalization,
  title     = {{On the Generalization Ability of Unsupervised Pretraining}},
  author    = {Deng, Yuyang and Hong, Junyuan and Zhou, Jiayu and Mahdavi, Mehrdad},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2024},
  pages     = {4519--4527},
  volume    = {238},
  url       = {https://mlanthology.org/aistats/2024/deng2024aistats-generalization/}
}