Preconditioned Crank-Nicolson Algorithms for Wide Bayesian Neural Networks

Abstract

Bayesian Neural Networks represent a fascinating confluence of deep learning techniques and probabilistic reasoning, offering a compelling framework for understanding uncertainty in complex predictive models. In this paper, we consider Bayesian Neural Networks with Gaussian initialization and investigate the use of the preconditioned Crank-Nicolson algorithm to sample from the reparametrized posterior distribution of the weights as the width of the network grows. In addition to being robust in the infinite-dimensional setting, we prove that the acceptance probability of the preconditioned Crank-Nicolson sampler approaches 1 as the width of the network goes to infinity, independently of any stepsize tuning. We then compare how the efficiency of the Langevin Monte Carlo, the preconditioned Crank-Nicolson, and the preconditioned Crank-Nicolson Langevin samplers is influenced by changes in the network width in some real-world cases. In particular, we demonstrate that in wide Bayesian Neural Network configurations, the proposed method allows for more efficient sampling, as evidenced by a higher effective sample size and improved diagnostic results compared with the Langevin Monte Carlo algorithm.
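
For readers unfamiliar with the sampler the abstract refers to, the following is a minimal sketch of a single preconditioned Crank-Nicolson (pCN) update, assuming the reparametrized prior is a standard Gaussian; the names `pcn_step` and `neg_log_lik`, and the default step size `beta`, are illustrative choices and not taken from the paper.

```python
import numpy as np

def pcn_step(theta, neg_log_lik, beta=0.1, rng=None):
    """One pCN step targeting pi(theta) ∝ exp(-neg_log_lik(theta)) * N(theta; 0, I).

    Assumes a standard-Gaussian (reparametrized) prior; beta in (0, 1] is the
    step-size parameter. Returns the new state and whether the proposal was accepted.
    """
    rng = rng or np.random.default_rng()
    xi = rng.standard_normal(theta.shape)                   # draw from the prior
    proposal = np.sqrt(1.0 - beta**2) * theta + beta * xi   # pCN proposal
    # The acceptance ratio involves only the likelihood term, not the prior,
    # which is what makes pCN well defined in the infinite-dimensional limit.
    log_alpha = neg_log_lik(theta) - neg_log_lik(proposal)
    if np.log(rng.uniform()) < log_alpha:
        return proposal, True
    return theta, False
```

The key design point, reflected in the acceptance ratio above, is that the Gaussian prior is preserved exactly by the proposal, so the accept/reject step depends only on the likelihood; this is the property behind the dimension-robust behavior discussed in the abstract.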

Cite

Text

Pezzetti et al. "Preconditioned Crank-Nicolson Algorithms for Wide Bayesian Neural Networks." NeurIPS 2024 Workshops: BDU, 2024.

Markdown

[Pezzetti et al. "Preconditioned Crank-Nicolson Algorithms for Wide Bayesian Neural Networks." NeurIPS 2024 Workshops: BDU, 2024.](https://mlanthology.org/neuripsw/2024/pezzetti2024neuripsw-preconditioned/)

BibTeX

@inproceedings{pezzetti2024neuripsw-preconditioned,
  title     = {{Preconditioned Crank-Nicolson Algorithms for Wide Bayesian Neural Networks}},
  author    = {Pezzetti, Lucia and Favaro, Stefano and Peluchetti, Stefano},
  booktitle = {NeurIPS 2024 Workshops: BDU},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/pezzetti2024neuripsw-preconditioned/}
}