Why Does Unsupervised Pre-Training Help Deep Learning?

Abstract

Much recent research has been devoted to learning algorithms for deep architectures such as Deep Belief Networks and stacks of auto-encoder variants, with impressive results obtained in several areas, mostly on vision and language datasets. The best results obtained on supervised learning tasks often involve an unsupervised learning component, usually in an unsupervised pre-training phase. The main question investigated here is the following: why does unsupervised pre-training work so well? Through extensive experimentation, we explore several possible explanations discussed in the literature, including its action as a regularizer (Erhan et al. 2009) and as an aid to optimization (Bengio et al. 2007). Our results build on the work of Erhan et al. 2009, showing that unsupervised pre-training appears to play predominantly a regularization role in subsequent supervised training. However, our results in an online setting, with a virtually unlimited data stream, point to a somewhat more nuanced interpretation of the roles of optimization and regularization in the unsupervised pre-training effect.
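
For readers unfamiliar with the procedure the paper analyzes, the sketch below illustrates greedy layer-wise unsupervised pre-training with denoising autoencoders, followed by supervised fine-tuning of the stacked network. It is a minimal illustration, not the authors' code: the layer sizes, corruption level, learning rates, epoch counts, and placeholder data are assumptions chosen for brevity.

```python
# Minimal sketch of unsupervised pre-training + supervised fine-tuning.
# All hyperparameters and data below are illustrative placeholders.
import torch
import torch.nn as nn

def pretrain_layer(encoder, x, epochs=5, noise=0.25, lr=1e-3):
    """Train one layer as a denoising autoencoder on (unlabeled) inputs x."""
    decoder = nn.Linear(encoder.out_features, encoder.in_features)
    opt = torch.optim.SGD(list(encoder.parameters()) + list(decoder.parameters()), lr=lr)
    for _ in range(epochs):
        corrupted = x * (torch.rand_like(x) > noise).float()  # mask-out corruption
        h = torch.sigmoid(encoder(corrupted))
        recon = torch.sigmoid(decoder(h))
        loss = nn.functional.binary_cross_entropy(recon, x)   # reconstruct clean input
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.sigmoid(encoder(x)).detach()  # representation fed to the next layer

# Greedy layer-wise pre-training: each layer models the output of the
# layer below it, without using any labels.
sizes = [784, 500, 500]                 # e.g. MNIST-sized inputs (assumption)
x = torch.rand(256, sizes[0])           # placeholder unlabeled data
layers = [nn.Linear(sizes[i], sizes[i + 1]) for i in range(len(sizes) - 1)]
h = x
for layer in layers:
    h = pretrain_layer(layer, h)

# Supervised fine-tuning: stack the pre-trained layers, add an output
# layer, and train the whole network on the labeled task as usual.
stacked = sum([[layer, nn.Sigmoid()] for layer in layers], [])  # interleave nonlinearities
net = nn.Sequential(*stacked, nn.Linear(sizes[-1], 10))
y = torch.randint(0, 10, (256,))        # placeholder labels
opt = torch.optim.SGD(net.parameters(), lr=1e-2)
for _ in range(10):
    loss = nn.functional.cross_entropy(net(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The paper's central question is why initializing the supervised phase from the pre-trained weights (rather than from random weights) improves generalization so consistently.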

Cite

Text

Erhan et al. "Why Does Unsupervised Pre-Training Help Deep Learning?" Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 2010.

Markdown

[Erhan et al. "Why Does Unsupervised Pre-Training Help Deep Learning?" Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 2010.](https://mlanthology.org/aistats/2010/erhan2010aistats-unsupervised/)

BibTeX

@inproceedings{erhan2010aistats-unsupervised,
  title     = {{Why Does Unsupervised Pre-Training Help Deep Learning?}},
  author    = {Erhan, Dumitru and Courville, Aaron and Bengio, Yoshua and Vincent, Pascal},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  year      = {2010},
  pages     = {201--208},
  volume    = {9},
  url       = {https://mlanthology.org/aistats/2010/erhan2010aistats-unsupervised/}
}