Reversible Jump MCMC Simulated Annealing for Neural Networks
Abstract
We propose a novel reversible jump Markov chain Monte Carlo (MCMC) simulated annealing algorithm to optimize radial basis function (RBF) networks. This algorithm enables us to maximize the joint posterior distribution of the network parameters and the number of basis functions. It performs a global search in the joint space of the parameters and number of parameters, thereby surmounting the problem of local minima. We also show that by calibrating a Bayesian model, we can obtain the classical AIC, BIC and MDL model selection criteria within a penalized likelihood framework. Finally, we show theoretically and empirically that the algorithm converges to the modes of the full posterior distribution in an efficient way.
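The abstract describes a global search over both the RBF parameters and the number of basis functions, with a penalized-likelihood (AIC/BIC/MDL-style) objective. A minimal illustrative sketch of this idea — not the paper's actual algorithm — is a reversible-jump simulated annealer whose birth/death moves change the number of Gaussian basis centers while an annealed Metropolis rule optimizes a size-penalized fit. All data, widths, penalties, and schedules below are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical; not from the paper).
x = np.linspace(-3, 3, 60)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

def rbf_design(x, centers, width=1.0):
    """Gaussian RBF design matrix, one column per center."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))

def energy_of(centers, penalty=2.0):
    """Penalized negative log-likelihood: least-squares fit of the weights
    given the centers, plus an AIC-style penalty on the number of bases."""
    Phi = rbf_design(x, centers)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    resid = y - Phi @ w
    nll = 0.5 * x.size * np.log(np.mean(resid**2) + 1e-12)
    return nll + penalty * centers.size

# Reversible-jump simulated annealing: birth/death moves change the model
# dimension; a within-model random walk perturbs the centers.
centers = rng.uniform(-3, 3, size=2)
energy = energy_of(centers)
T = 5.0
for _ in range(2000):
    u = rng.random()
    if u < 0.3 and centers.size < 20:        # birth: propose a new center
        prop = np.append(centers, rng.uniform(-3, 3))
    elif u < 0.6 and centers.size > 1:       # death: remove a random center
        prop = np.delete(centers, rng.integers(centers.size))
    else:                                    # move: jitter existing centers
        prop = centers + 0.2 * rng.standard_normal(centers.size)
    e_prop = energy_of(prop)
    # Annealed Metropolis acceptance (always accept downhill moves).
    if e_prop < energy or rng.random() < np.exp(-(e_prop - energy) / T):
        centers, energy = prop, e_prop
    T = max(0.01, T * 0.998)                 # geometric cooling schedule

print(f"final model size: {centers.size}, penalized energy: {energy:.2f}")
```

As the temperature falls, uphill jumps (including dimension changes that worsen the penalized fit) become increasingly rare, so the chain concentrates on low-energy models — the mechanism by which the annealed sampler targets posterior modes.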
Cite
Text
Andrieu et al. "Reversible Jump MCMC Simulated Annealing for Neural Networks." Conference on Uncertainty in Artificial Intelligence, 2000.
Markdown
[Andrieu et al. "Reversible Jump MCMC Simulated Annealing for Neural Networks." Conference on Uncertainty in Artificial Intelligence, 2000.](https://mlanthology.org/uai/2000/andrieu2000uai-reversible/)
BibTeX
@inproceedings{andrieu2000uai-reversible,
title = {{Reversible Jump MCMC Simulated Annealing for Neural Networks}},
author = {Andrieu, Christophe and de Freitas, Nando and Doucet, Arnaud},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2000},
pages = {11--18},
url = {https://mlanthology.org/uai/2000/andrieu2000uai-reversible/}
}