Dropout: Explicit Forms and Capacity Control
Abstract
We investigate the capacity control provided by dropout in various machine learning problems. First, we study dropout for matrix sensing, where it induces a data-dependent regularizer that, in expectation, equals the weighted trace-norm of the product of the factors. In deep learning, we show that the data-dependent regularizer due to dropout directly controls the Rademacher complexity of the underlying class of deep neural networks. These developments enable us to give concrete generalization error bounds for the dropout algorithm in both matrix completion and the training of deep neural networks. We evaluate our theoretical findings on real-world datasets, including MovieLens, Fashion-MNIST, and CIFAR-10.
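As a minimal numerical sketch (not code from the paper), the identity behind the matrix-sensing result can be checked directly: for a factorization UVᵀ with dropout applied to the r inner dimensions, averaging the squared loss over all dropout masks recovers the plain loss plus an explicit regularizer (p/(1-p)) Σⱼ ‖uⱼ‖² (vⱼᵀx)², whose expectation over the data distribution yields the data-dependent regularizer discussed in the abstract. All names and sizes below (U, V, x, y, p) are illustrative assumptions.

import itertools
import numpy as np

rng = np.random.default_rng(0)
d1, d2, r, p = 5, 4, 3, 0.5   # illustrative sizes; p = drop probability
q = 1.0 - p                   # keep probability

U = rng.normal(size=(d2, r))  # factor U with columns u_j
V = rng.normal(size=(d1, r))  # factor V with columns v_j
x = rng.normal(size=d1)       # one input
y = rng.normal(size=d2)       # one target

# Exact expectation of the dropout loss over all 2^r masks b in {0,1}^r,
# weighting each mask by its Bernoulli(q) probability.
expected_loss = 0.0
for b in itertools.product([0, 1], repeat=r):
    b = np.array(b, dtype=float)
    prob = np.prod(np.where(b == 1, q, p))
    pred = (U * (b / q)) @ (V.T @ x)   # inverted-dropout scaling by 1/q
    expected_loss += prob * np.sum((y - pred) ** 2)

# Closed form: plain loss plus the explicit regularizer
# (p/q) * sum_j ||u_j||^2 (v_j^T x)^2.
plain_loss = np.sum((y - U @ V.T @ x) ** 2)
reg = (p / q) * np.sum(np.sum(U ** 2, axis=0) * (V.T @ x) ** 2)

print(expected_loss, plain_loss + reg)  # agree up to floating-point error

Replacing the single input x with an expectation over the data distribution turns (vⱼᵀx)² into vⱼᵀE[xxᵀ]vⱼ, which is what makes the induced regularizer data-dependent.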
Cite
Text
Arora et al. "Dropout: Explicit Forms and Capacity Control." International Conference on Learning Representations, 2020.
Markdown
[Arora et al. "Dropout: Explicit Forms and Capacity Control." International Conference on Learning Representations, 2020.](https://mlanthology.org/iclr/2020/arora2020iclr-dropout/)
BibTeX
@inproceedings{arora2020iclr-dropout,
  title     = {{Dropout: Explicit Forms and Capacity Control}},
  author    = {Arora, Raman and Bartlett, Peter and Mianjy, Poorya and Srebro, Nathan},
  booktitle = {International Conference on Learning Representations},
  year      = {2020},
  url       = {https://mlanthology.org/iclr/2020/arora2020iclr-dropout/}
}