Bootstrapping Neural Networks
Abstract
Knowledge about the distribution of a statistical estimator is important for various purposes, such as the construction of confidence intervals for model parameters or the determination of critical values of tests. A widely used method to estimate this distribution is the so-called bootstrap, which is based on an imitation of the probabilistic structure of the data-generating process on the basis of the information provided by a given set of random observations. In this article we investigate this classical method in the context of artificial neural networks used for estimating a mapping from input to output space. We establish consistency results for bootstrap estimates of the distribution of parameter estimates.
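The idea the abstract describes can be sketched in a few lines: resample the observed data with replacement, refit the model on each resample, and use the spread of the refitted parameters as an estimate of the estimator's distribution. The sketch below is illustrative only (it is not the construction analyzed in the paper): it uses a one-parameter "network" `y ≈ tanh(w·x)` fitted by gradient descent, synthetic data, and the pairs bootstrap with a simple percentile confidence interval.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (illustrative, not from the paper): y = tanh(2x) + noise.
n = 200
x = rng.uniform(-2.0, 2.0, n)
y = np.tanh(2.0 * x) + 0.1 * rng.normal(size=n)

def fit_weight(x, y, steps=500, lr=0.2):
    """Least-squares fit of the single parameter w in y ≈ tanh(w * x)."""
    w = 1.0
    for _ in range(steps):
        r = np.tanh(w * x) - y                      # residuals
        grad = np.mean(2.0 * r * (1.0 - np.tanh(w * x) ** 2) * x)
        w -= lr * grad
    return w

w_hat = fit_weight(x, y)  # parameter estimate on the original sample

# Pairs bootstrap: resample (x_i, y_i) with replacement and refit each time.
B = 200
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)
    boot[b] = fit_weight(x[idx], y[idx])

# Percentile confidence interval from the bootstrap distribution of w.
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"w_hat={w_hat:.3f}, 95% CI=({lo:.3f}, {hi:.3f})")
```

The consistency results in the paper are what justify treating such a bootstrap distribution as an approximation to the true sampling distribution of the parameter estimate; for real multi-parameter networks one would refit the full network on each resample in the same way.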
Cite
Text
Franke and Neumann. "Bootstrapping Neural Networks." Neural Computation, 2000. doi:10.1162/089976600300015204
Markdown
[Franke and Neumann. "Bootstrapping Neural Networks." Neural Computation, 2000.](https://mlanthology.org/neco/2000/franke2000neco-bootstrapping/) doi:10.1162/089976600300015204
BibTeX
@article{franke2000neco-bootstrapping,
title = {{Bootstrapping Neural Networks}},
author = {Franke, Jürgen and Neumann, Michael H.},
journal = {Neural Computation},
year = {2000},
pages = {1929-1949},
doi = {10.1162/089976600300015204},
volume = {12},
url = {https://mlanthology.org/neco/2000/franke2000neco-bootstrapping/}
}