Hamiltonian Monte Carlo for Efficient Gaussian Sampling: Long and Random Steps
Abstract
Hamiltonian Monte Carlo (HMC) is a Markov chain algorithm for sampling from a high-dimensional distribution with density $e^{-f(x)}$, given access to the gradient of $f$. A particular case of interest is that of a $d$-dimensional Gaussian distribution with covariance matrix $\Sigma$, in which case $f(x) = x^\top \Sigma^{-1} x$. We show that Metropolis-adjusted HMC can sample from a distribution that is $\varepsilon$-close to a Gaussian in total variation distance using $\widetilde{O}(\sqrt{\kappa} d^{1/4} \log(1/\varepsilon))$ gradient queries, where $\varepsilon>0$ and $\kappa$ is the condition number of $\Sigma$. Our algorithm uses long and random integration times for the Hamiltonian dynamics, and it creates a warm start by first running HMC without a Metropolis adjustment. This contrasts with (and was motivated by) recent results that give an $\widetilde\Omega(\kappa d^{1/2})$ query lower bound for HMC with fixed integration times or from a cold start, even for the Gaussian case.
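The ingredients named in the abstract can be illustrated with a minimal sketch: Metropolis-adjusted HMC targeting $e^{-f(x)}$ with $f(x) = x^\top \Sigma^{-1} x$, using leapfrog integration and a randomly drawn number of leapfrog steps per iteration. This is an illustrative implementation only, not the paper's analyzed algorithm; the step size, the uniform distribution over integration times, and the omission of the unadjusted warm-start phase are simplifying assumptions.

```python
import numpy as np

def hmc_gaussian(Sigma_inv, x0, n_iter, step, max_steps, rng):
    """Metropolis-adjusted HMC for the density exp(-f(x)), f(x) = x^T Sigma_inv x.

    Illustrative sketch: each iteration resamples the momentum, draws a random
    number of leapfrog steps (the "long and random" integration time), simulates
    the Hamiltonian dynamics, and applies a Metropolis accept/reject correction.
    """
    f = lambda x: x @ Sigma_inv @ x
    grad = lambda x: 2.0 * Sigma_inv @ x
    x = np.asarray(x0, dtype=float).copy()
    samples = []
    for _ in range(n_iter):
        p = rng.standard_normal(x.shape)        # fresh Gaussian momentum
        L = int(rng.integers(1, max_steps + 1)) # random integration time (assumed uniform)
        q, r = x.copy(), p.copy()
        # Leapfrog integration of the Hamiltonian dynamics
        r -= 0.5 * step * grad(q)
        for _ in range(L - 1):
            q += step * r
            r -= step * grad(q)
        q += step * r
        r -= 0.5 * step * grad(q)
        # Metropolis adjustment on the Hamiltonian H(x, p) = f(x) + |p|^2 / 2
        dH = (f(q) + 0.5 * r @ r) - (f(x) + 0.5 * p @ p)
        if np.log(rng.uniform()) < -dH:
            x = q
        samples.append(x.copy())
    return np.array(samples)
```

For instance, with `Sigma_inv = np.eye(d)` the target density is $e^{-\|x\|^2}$, a centered Gaussian with covariance $\tfrac{1}{2}I$, which the empirical mean and variance of the chain should approach.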
Cite
Text
Apers et al. "Hamiltonian Monte Carlo for Efficient Gaussian Sampling: Long and Random Steps." Journal of Machine Learning Research, 2024.
Markdown
[Apers et al. "Hamiltonian Monte Carlo for Efficient Gaussian Sampling: Long and Random Steps." Journal of Machine Learning Research, 2024.](https://mlanthology.org/jmlr/2024/apers2024jmlr-hamiltonian/)
BibTeX
@article{apers2024jmlr-hamiltonian,
title = {{Hamiltonian Monte Carlo for Efficient Gaussian Sampling: Long and Random Steps}},
author = {Apers, Simon and Gribling, Sander and Szilágyi, Dániel},
journal = {Journal of Machine Learning Research},
year = {2024},
pages = {1-30},
volume = {25},
url = {https://mlanthology.org/jmlr/2024/apers2024jmlr-hamiltonian/}
}