Automatically Marginalized MCMC in Probabilistic Programming
Abstract
Hamiltonian Monte Carlo (HMC) is a powerful algorithm to sample latent variables from Bayesian models. The advent of probabilistic programming languages (PPLs) frees users from writing inference algorithms and lets users focus on modeling. However, many models are difficult for HMC to solve directly, and often require tricks like model reparameterization. We are motivated by the fact that many of those models could be simplified by marginalization. We propose to use automatic marginalization as part of the sampling process using HMC in a graphical model extracted from a PPL, which substantially improves sampling from real-world hierarchical models.
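To illustrate the kind of simplification the paper automates, below is a minimal sketch of marginalizing a conjugate Gaussian latent out of a toy hierarchical model before running HMC. It is written against NumPyro purely for illustration; the model, the variable names, and the known scales `tau` and `sigma` are assumptions of this sketch, not the authors' implementation.

```python
# A minimal sketch, assuming NumPyro as the PPL and a toy Gaussian hierarchy;
# it illustrates the marginalization idea, not the paper's actual code.
import jax.numpy as jnp
from jax import random
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

tau, sigma = 1.0, 0.5  # group-level and observation scales, assumed known here


def centered_model(y):
    # y: (G, N) observations; z holds the per-group latent means.
    G, N = y.shape
    mu = numpyro.sample("mu", dist.Normal(0.0, 5.0))
    z = numpyro.sample("z", dist.Normal(mu * jnp.ones(G), tau).to_event(1))
    numpyro.sample(
        "y", dist.Normal(z[:, None], sigma).expand([G, N]).to_event(2), obs=y
    )


def marginalized_model(y):
    # Integrating each z_g out of N(z_g; mu, tau) * prod_i N(y_gi; z_g, sigma)
    # leaves a multivariate normal over that group's observations:
    #   y_g ~ N(mu * 1, sigma^2 I + tau^2 1 1^T)
    G, N = y.shape
    mu = numpyro.sample("mu", dist.Normal(0.0, 5.0))
    cov = sigma**2 * jnp.eye(N) + tau**2 * jnp.ones((N, N))
    numpyro.sample(
        "y",
        dist.MultivariateNormal(mu * jnp.ones(N), covariance_matrix=cov)
        .expand([G])
        .to_event(1),
        obs=y,
    )


# HMC/NUTS now only explores mu; the z_g can be recovered afterwards from
# their closed-form Gaussian conditionals if the per-group means are needed.
y = dist.Normal(0.0, 1.0).sample(random.PRNGKey(0), (8, 5))
mcmc = MCMC(NUTS(marginalized_model), num_warmup=500, num_samples=500)
mcmc.run(random.PRNGKey(1), y)
print(mcmc.get_samples()["mu"].mean())
```

In the marginalized version the sampler no longer has to traverse the funnel-like coupling between the global and per-group latents, which is the kind of geometry that otherwise forces reparameterization tricks.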
Cite

Text

Lai et al. "Automatically Marginalized MCMC in Probabilistic Programming." International Conference on Machine Learning, 2023.

Markdown

[Lai et al. "Automatically Marginalized MCMC in Probabilistic Programming." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/lai2023icml-automatically/)

BibTeX
@inproceedings{lai2023icml-automatically,
title = {{Automatically Marginalized MCMC in Probabilistic Programming}},
author = {Lai, Jinlin and Burroni, Javier and Guan, Hui and Sheldon, Daniel},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {18301--18318},
volume = {202},
url = {https://mlanthology.org/icml/2023/lai2023icml-automatically/}
}