Regularization Learning of Neural Networks for Generalization

Abstract

In this paper, we propose a learning method for neural networks based on the regularization method and analyze its generalization capability. In learning from examples, training samples are drawn independently from some unknown probability distribution. The goal of learning is to minimize the expected risk on future test samples, which are drawn from the same distribution. The problem can be reduced to estimating the probability distribution from the samples alone, but this estimation is generally ill-posed. To solve it stably, we use the regularization method. In practice, regularization learning can be carried out by enlarging the training set, adding an appropriate amount of noise to the training samples. We estimate the generalization error, defined as the difference between the expected risk achieved by the learning method and the truly minimum expected risk. Assume the p-dimensional density function is s-times differentiable in each variable. We show that the mean square of the generalization error of regularization learning is given as Dn^{−2s/(2s+p)}, where n is the number of samples and D is a constant depending on the complexity of the neural network and the difficulty of the problem.
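The practical step described in the abstract, enlarging the training set by perturbing each sample with noise, can be sketched as follows. This is an illustrative NumPy sketch, not the paper's algorithm; the function name and the `copies` and `sigma` parameters are assumptions chosen for illustration.

```python
import numpy as np

def augment_with_noise(X, y, copies=5, sigma=0.1, rng=None):
    """Enlarge a training set by jittering each input with Gaussian noise.

    Illustrative sketch of noise-injection regularization: each sample is
    replicated `copies` times and the inputs (not the targets) are
    perturbed by N(0, sigma^2) noise. Parameters are hypothetical,
    not taken from the paper.
    """
    rng = np.random.default_rng(rng)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    # Replicate each sample, then perturb only the input coordinates.
    X_rep = np.repeat(X, copies, axis=0)
    y_rep = np.repeat(y, copies, axis=0)
    X_noisy = X_rep + rng.normal(scale=sigma, size=X_rep.shape)
    return X_noisy, y_rep

# Usage: 10 samples in 3 dimensions become 50 noisy training samples.
X = np.zeros((10, 3))
y = np.arange(10)
Xa, ya = augment_with_noise(X, y, copies=5, sigma=0.05, rng=0)
print(Xa.shape, ya.shape)  # → (50, 3) (50,)
```

The enlarged, noise-perturbed set would then be used to train the network as usual; the amount of noise `sigma` plays the role of the regularization strength.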

Cite

Text

Akaho. "Regularization Learning of Neural Networks for Generalization." International Conference on Algorithmic Learning Theory, 1992. doi:10.1007/3-540-57369-0_31

Markdown

[Akaho. "Regularization Learning of Neural Networks for Generalization." International Conference on Algorithmic Learning Theory, 1992.](https://mlanthology.org/alt/1992/akaho1992alt-regularization/) doi:10.1007/3-540-57369-0_31

BibTeX

@inproceedings{akaho1992alt-regularization,
  title     = {{Regularization Learning of Neural Networks for Generalization}},
  author    = {Akaho, Shotaro},
  booktitle = {International Conference on Algorithmic Learning Theory},
  year      = {1992},
  pages     = {99--110},
  doi       = {10.1007/3-540-57369-0_31},
  url       = {https://mlanthology.org/alt/1992/akaho1992alt-regularization/}
}