Semi-Separable Hamiltonian Monte Carlo for Inference in Bayesian Hierarchical Models
Abstract
Sampling from hierarchical Bayesian models is often difficult for MCMC methods because of the strong correlations between the model parameters and the hyperparameters. Recent Riemannian manifold Hamiltonian Monte Carlo (RMHMC) methods have significant potential advantages in this setting, but are computationally expensive. We introduce a new RMHMC method, semi-separable Hamiltonian Monte Carlo, which uses a specially designed mass matrix that allows the joint Hamiltonian over model parameters and hyperparameters to decompose into two simpler Hamiltonians. This structure is exploited by a new integrator which we call the alternating blockwise leapfrog algorithm. The resulting method can mix faster than simpler Gibbs sampling while being simpler and more efficient than previous instances of RMHMC.
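For background, the alternating blockwise leapfrog algorithm described above builds on the standard leapfrog integrator used in HMC. The following is a minimal sketch of standard leapfrog for a separable Hamiltonian H(q, p) = U(q) + pᵀp/2 with an identity mass matrix; it is generic HMC background, not the paper's semi-separable scheme, and the function names are illustrative.

```python
import numpy as np

def leapfrog(q, p, grad_u, eps, n_steps):
    """Standard leapfrog integrator for H(q, p) = U(q) + p.T @ p / 2
    (separable Hamiltonian, identity mass matrix)."""
    q, p = q.copy(), p.copy()
    p -= 0.5 * eps * grad_u(q)       # initial half step for momentum
    for _ in range(n_steps - 1):
        q += eps * p                 # full step for position
        p -= eps * grad_u(q)         # full step for momentum
    q += eps * p                     # final full position step
    p -= 0.5 * eps * grad_u(q)       # final half momentum step
    return q, -p                     # negate momentum so the proposal is reversible

# Example: standard Gaussian target, U(q) = q^2 / 2, so grad U(q) = q.
grad_u = lambda q: q
q0, p0 = np.array([1.0]), np.array([0.5])
q1, p1 = leapfrog(q0, p0, grad_u, eps=0.1, n_steps=10)

# Leapfrog approximately conserves the Hamiltonian, which is what
# keeps HMC acceptance rates high:
h0 = 0.5 * float(q0 @ q0) + 0.5 * float(p0 @ p0)
h1 = 0.5 * float(q1 @ q1) + 0.5 * float(p1 @ p1)
```

The semi-separable construction in the paper decomposes the joint Hamiltonian over parameters and hyperparameters so that each block can be updated with simple leapfrog steps of this kind, alternating between the two blocks.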
Cite
Text
Zhang and Sutton. "Semi-Separable Hamiltonian Monte Carlo for Inference in Bayesian Hierarchical Models." Neural Information Processing Systems, 2014.
Markdown
[Zhang and Sutton. "Semi-Separable Hamiltonian Monte Carlo for Inference in Bayesian Hierarchical Models." Neural Information Processing Systems, 2014.](https://mlanthology.org/neurips/2014/zhang2014neurips-semiseparable/)
BibTeX
@inproceedings{zhang2014neurips-semiseparable,
title = {{Semi-Separable Hamiltonian Monte Carlo for Inference in Bayesian Hierarchical Models}},
author = {Zhang, Yichuan and Sutton, Charles},
booktitle = {Neural Information Processing Systems},
year = {2014},
pages = {10--18},
url = {https://mlanthology.org/neurips/2014/zhang2014neurips-semiseparable/}
}