The Fundamental Incompatibility of Scalable Hamiltonian Monte Carlo and Naive Data Subsampling
Abstract
Leveraging the coherent exploration of Hamiltonian flow, Hamiltonian Monte Carlo produces computationally efficient Monte Carlo estimators, even with respect to complex and high-dimensional target distributions. When confronted with data-intensive applications, however, the algorithm may be too expensive to implement, leaving us to consider the utility of approximations such as data subsampling. In this paper I demonstrate how data subsampling fundamentally compromises the scalability of Hamiltonian Monte Carlo.
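The incompatibility the abstract describes can be illustrated with a minimal sketch (my own illustration under simplifying assumptions, not code from the paper): the data set, model, batch size, and step size below are all hypothetical. Hamiltonian Monte Carlo relies on a leapfrog integrator that nearly conserves the Hamiltonian; naively replacing the full-data gradient with a rescaled minibatch gradient keeps the gradient estimate unbiased but injects noise into the flow, so the energy is no longer conserved along a trajectory.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: N observations from a unit-variance normal whose
# mean theta we infer under a flat prior (illustrative assumption).
N = 1000
data = rng.normal(loc=2.0, scale=1.0, size=N)

def grad_U_full(theta):
    """Gradient of the potential energy (negative log posterior)
    using every observation."""
    return -np.sum(data - theta)

def grad_U_subsampled(theta, batch_size=50):
    """Naive subsampling: rescale a minibatch gradient by N / batch_size.
    Unbiased as a gradient estimate, but the noise it injects into the
    leapfrog integrator breaks conservation of the Hamiltonian."""
    batch = rng.choice(data, size=batch_size, replace=False)
    return -(N / batch_size) * np.sum(batch - theta)

def hamiltonian(theta, p):
    """Potential plus unit-mass kinetic energy."""
    return 0.5 * np.sum((data - theta) ** 2) + 0.5 * p ** 2

def leapfrog(theta, p, eps, n_steps, grad_U):
    """Standard leapfrog trajectory with the supplied gradient."""
    p = p - 0.5 * eps * grad_U(theta)          # initial half step
    for step in range(n_steps):
        theta = theta + eps * p                # full position step
        if step < n_steps - 1:
            p = p - eps * grad_U(theta)        # full momentum step
    p = p - 0.5 * eps * grad_U(theta)          # final half step
    return theta, p

# Compare energy drift over one trajectory from the same start point.
theta0, p0 = data.mean(), 1.0
H0 = hamiltonian(theta0, p0)
th_f, p_f = leapfrog(theta0, p0, eps=0.003, n_steps=20, grad_U=grad_U_full)
th_s, p_s = leapfrog(theta0, p0, eps=0.003, n_steps=20, grad_U=grad_U_subsampled)
err_full = abs(hamiltonian(th_f, p_f) - H0)
err_sub = abs(hamiltonian(th_s, p_s) - H0)
print("energy error, full gradient:      ", err_full)
print("energy error, subsampled gradient:", err_sub)
```

With the full-data gradient the energy error stays tiny, while the subsampled gradient produces a much larger drift; in a Metropolis-corrected sampler that drift translates into collapsing acceptance probabilities, which is the scalability failure the paper analyzes.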
Cite
Text
Betancourt. "The Fundamental Incompatibility of Scalable Hamiltonian Monte Carlo and Naive Data Subsampling." International Conference on Machine Learning, 2015.
Markdown
[Betancourt. "The Fundamental Incompatibility of Scalable Hamiltonian Monte Carlo and Naive Data Subsampling." International Conference on Machine Learning, 2015.](https://mlanthology.org/icml/2015/betancourt2015icml-fundamental/)
BibTeX
@inproceedings{betancourt2015icml-fundamental,
title = {{The Fundamental Incompatibility of Scalable Hamiltonian Monte Carlo and Naive Data Subsampling}},
author = {Betancourt, Michael},
booktitle = {International Conference on Machine Learning},
year = {2015},
pages = {533--540},
volume = {37},
url = {https://mlanthology.org/icml/2015/betancourt2015icml-fundamental/}
}