Regularization of Neural Networks Using DropConnect

Abstract

We introduce DropConnect, a generalization of Dropout, for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show that state-of-the-art results on several image recognition benchmarks can be obtained by aggregating multiple DropConnect-trained models.
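To make the distinction concrete, the sketch below contrasts the two masking schemes for a single fully connected layer. It is a minimal NumPy illustration, not the paper's implementation; the function names, the choice of ReLU as the activation, and the keep probability p are assumptions made for the example.

import numpy as np

def dropconnect_forward(W, v, b, p=0.5, rng=None):
    # DropConnect: a Bernoulli keep-mask M is applied to the weights,
    # so each output unit sees a random subset of the previous layer's
    # units: r = a((M * W) v + b), with a(.) taken as ReLU here.
    rng = rng or np.random.default_rng()
    M = rng.binomial(1, p, size=W.shape)
    return np.maximum(0.0, (M * W) @ v + b)

def dropout_forward(W, v, b, p=0.5, rng=None):
    # Dropout, for comparison: the mask m is applied to the output
    # activations rather than the weights: r = m * a(W v + b).
    rng = rng or np.random.default_rng()
    m = rng.binomial(1, p, size=b.shape)
    return m * np.maximum(0.0, W @ v + b)

These functions describe only the training-time forward pass; at test time the paper does not apply a single mask but instead averages over masks (rescaling by p for Dropout, and a Gaussian moment-matching approximation for DropConnect).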

Cite

Text

Wan et al. "Regularization of Neural Networks Using DropConnect." International Conference on Machine Learning, 2013.

Markdown

[Wan et al. "Regularization of Neural Networks Using DropConnect." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/wan2013icml-regularization/)

BibTeX

@inproceedings{wan2013icml-regularization,
  title     = {{Regularization of Neural Networks Using DropConnect}},
  author    = {Wan, Li and Zeiler, Matthew and Zhang, Sixin and Le Cun, Yann and Fergus, Rob},
  booktitle = {International Conference on Machine Learning},
  year      = {2013},
  pages     = {1058--1066},
  volume    = {28},
  url       = {https://mlanthology.org/icml/2013/wan2013icml-regularization/}
}