MaxGain: Regularisation of Neural Networks by Constraining Activation Magnitudes

Abstract

Effective regularisation of neural networks is essential to combat overfitting caused by their large number of parameters. We present an empirical analogue of the Lipschitz constant of a feed-forward neural network, which we refer to as the maximum gain. We hypothesise that constraining the gain of a network will have a regularising effect, much as constraining the Lipschitz constant of a network has been shown to improve generalisation. We provide a simple algorithm that rescales the weight matrix of each layer after each parameter update. We conduct a series of studies on common benchmark datasets, as well as on a novel dataset that we introduce to enable easier significance testing for experiments using convolutional networks. Performance on these datasets compares favourably with that of other common regularisation techniques.
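
The constraint described in the abstract can be illustrated with a short sketch. The following is a minimal PyTorch-style example, not the authors' published implementation: the empirical gain of a layer is estimated as the largest ratio of output norm to input norm over a minibatch, and after each optimiser step the weight matrix is rescaled whenever that gain exceeds a chosen bound. The names empirical_gain, constrain_gain, and max_gain are illustrative, and the treatment of bias terms is simplified.

import torch

def empirical_gain(layer, x):
    # Estimate the gain of `layer` on minibatch `x`: the largest ratio of
    # output norm to input norm across the batch. (For simplicity this
    # ignores the fact that a bias term makes the layer affine rather
    # than linear, so the rescaling below is only approximate.)
    with torch.no_grad():
        y = layer(x)
        ratios = y.norm(dim=1) / x.norm(dim=1).clamp_min(1e-12)
        return ratios.max().item()

def constrain_gain(layer, x, max_gain):
    # Project the weights after a parameter update: if the empirical gain
    # on `x` exceeds `max_gain`, shrink the weight matrix in place.
    g = empirical_gain(layer, x)
    if g > max_gain:
        with torch.no_grad():
            layer.weight.mul_(max_gain / g)

# Hypothetical usage inside a training loop, after each update:
#   optimizer.step()
#   for layer, batch_input in layers_with_inputs:
#       constrain_gain(layer, batch_input, max_gain=2.0)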

Cite

Text

Gouk et al. "MaxGain: Regularisation of Neural Networks by Constraining Activation Magnitudes." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2018. doi:10.1007/978-3-030-10925-7_33

Markdown

[Gouk et al. "MaxGain: Regularisation of Neural Networks by Constraining Activation Magnitudes." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2018.](https://mlanthology.org/ecmlpkdd/2018/gouk2018ecmlpkdd-maxgain/) doi:10.1007/978-3-030-10925-7_33

BibTeX

@inproceedings{gouk2018ecmlpkdd-maxgain,
  title     = {{MaxGain: Regularisation of Neural Networks by Constraining Activation Magnitudes}},
  author    = {Gouk, Henry and Pfahringer, Bernhard and Frank, Eibe and Cree, Michael J.},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2018},
  pages     = {541--556},
  doi       = {10.1007/978-3-030-10925-7_33},
  url       = {https://mlanthology.org/ecmlpkdd/2018/gouk2018ecmlpkdd-maxgain/}
}