Global Optimisation of Neural Network Models via Sequential Sampling
Abstract
We propose a novel strategy for training neural networks using sequential sampling-importance resampling algorithms. This global optimisation strategy allows us to learn the probability distribution of the network weights in a sequential framework. It is well suited to applications involving on-line, nonlinear, non-Gaussian or non-stationary signal processing.
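The abstract describes treating the network weights as a probability distribution tracked on-line by sampling-importance resampling (SIR). As a rough illustration of that idea (not the paper's exact algorithm), the sketch below maintains a particle cloud over the weights of a tiny MLP: each incoming observation propagates the particles with a random-walk transition, reweights them by a Gaussian likelihood, and resamples. All specifics here — the 1-5-1 network, the noise levels `sigma_v` and `sigma_w`, and the `sin` target — are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(w, x):
    # Tiny 1-5-1 MLP; the vector w packs all 16 weights (illustrative choice).
    W1, b1 = w[:5].reshape(5, 1), w[5:10]
    W2, b2 = w[10:15], w[15]
    h = np.tanh(x @ W1.T + b1)
    return h @ W2 + b2

n_particles, dim = 500, 16
particles = rng.normal(0.0, 1.0, (n_particles, dim))  # prior over weights
sigma_v, sigma_w = 0.05, 0.1  # assumed process / observation noise

# Streaming data from a noisy nonlinear target.
for t in range(200):
    x = rng.uniform(-2.0, 2.0, (1, 1))
    y = np.sin(x).ravel() + rng.normal(0.0, sigma_w)
    # 1) Propagate: random-walk transition on the weight particles.
    particles += rng.normal(0.0, sigma_v, particles.shape)
    # 2) Importance weights: Gaussian likelihood of y under each particle.
    preds = np.array([mlp(w, x).ravel()[0] for w in particles])
    logw = -0.5 * ((y - preds) / sigma_w) ** 2
    w_norm = np.exp(logw - logw.max())
    w_norm /= w_norm.sum()
    # 3) Resample (the "R" in SIR): multinomial draw by weight.
    idx = rng.choice(n_particles, size=n_particles, p=w_norm)
    particles = particles[idx]

# Posterior-mean prediction at a test input.
x_test = np.array([[0.5]])
y_hat = np.mean([mlp(w, x_test).ravel()[0] for w in particles])
```

Because the whole weight posterior is carried forward rather than a single point estimate, the same loop keeps working when the target drifts over time — the non-stationary setting the abstract highlights.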
Cite
de Freitas et al. "Global Optimisation of Neural Network Models via Sequential Sampling." Neural Information Processing Systems, 1998.
@inproceedings{defreitas1998neurips-global,
title = {{Global Optimisation of Neural Network Models via Sequential Sampling}},
author = {de Freitas, João F. G. and Niranjan, Mahesan and Doucet, Arnaud and Gee, Andrew H.},
booktitle = {Neural Information Processing Systems},
year = {1998},
pages = {410--416},
url = {https://mlanthology.org/neurips/1998/defreitas1998neurips-global/}
}