Particle Gibbs with Ancestor Sampling for Probabilistic Programs
Abstract
Particle Markov chain Monte Carlo techniques rank among current state-of-the-art methods for probabilistic program inference. A drawback of these techniques is that they rely on importance resampling, which results in degenerate particle trajectories and a low effective sample size for variables sampled early in a program. We here develop a formalism to adapt ancestor resampling, a technique that mitigates particle degeneracy, to the probabilistic programming setting. We present empirical results that demonstrate nontrivial performance gains.
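The ancestor-resampling idea described in the abstract can be made concrete with a small sketch. The code below is a minimal illustration of one particle Gibbs with ancestor sampling (PGAS) sweep for a one-dimensional Gaussian state-space model, not the paper's probabilistic-program formalism; the model, parameter values, and the function name pgas_sweep are assumptions made purely for illustration. The key line is the one where the conditioned reference particle's ancestor is resampled in proportion to the previous weights times the transition density, which is the mechanism that mitigates the trajectory degeneracy caused by ordinary resampling.

```python
# Minimal sketch of a PGAS sweep for an assumed 1-D Gaussian state-space model:
#   x_t ~ Normal(a * x_{t-1}, q),   y_t ~ Normal(x_t, r)   (q, r are variances)
# Illustrative only; not the paper's probabilistic-programming construction.
import numpy as np

def pgas_sweep(y, x_ref, num_particles=20, a=0.9, q=1.0, r=1.0, rng=None):
    """Run one PGAS sweep conditioned on a reference trajectory x_ref."""
    rng = np.random.default_rng() if rng is None else rng
    T, N = len(y), num_particles
    x = np.zeros((T, N))              # particle states
    anc = np.zeros((T, N), dtype=int) # ancestor indices

    # t = 0: sample the N-1 free particles from the prior; the last slot
    # holds the conditioned reference trajectory.
    x[0, :-1] = rng.normal(0.0, np.sqrt(q), N - 1)
    x[0, -1] = x_ref[0]
    logw = -0.5 * (y[0] - x[0]) ** 2 / r

    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()

        # Standard multinomial resampling and propagation of the free particles.
        anc[t, :-1] = rng.choice(N, size=N - 1, p=w)
        x[t, :-1] = rng.normal(a * x[t - 1, anc[t, :-1]], np.sqrt(q))

        # Ancestor sampling: the reference particle keeps its state, but its
        # ancestor is drawn with probability proportional to
        # w_{t-1}^i * f(x_ref[t] | x_{t-1}^i).
        x[t, -1] = x_ref[t]
        log_as = np.log(w) - 0.5 * (x_ref[t] - a * x[t - 1]) ** 2 / q
        as_w = np.exp(log_as - log_as.max())
        anc[t, -1] = rng.choice(N, p=as_w / as_w.sum())

        logw = -0.5 * (y[t] - x[t]) ** 2 / r

    # Draw one index by the final weights and trace its ancestry back to
    # obtain the next reference trajectory.
    w = np.exp(logw - logw.max())
    k = rng.choice(N, p=w / w.sum())
    traj = np.empty(T)
    for t in range(T - 1, -1, -1):
        traj[t] = x[t, k]
        k = anc[t, k]
    return traj
```

A full particle Gibbs sampler would alternate such sweeps with updates of the model parameters, feeding each returned trajectory back in as the next reference; without the ancestor-sampling step, the retained trajectory tends to force all particles to share its early states, which is the degeneracy the abstract refers to.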
Cite
Text
van de Meent et al. "Particle Gibbs with Ancestor Sampling for Probabilistic Programs." International Conference on Artificial Intelligence and Statistics, 2015.
Markdown
[van de Meent et al. "Particle Gibbs with Ancestor Sampling for Probabilistic Programs." International Conference on Artificial Intelligence and Statistics, 2015.](https://mlanthology.org/aistats/2015/vandemeent2015aistats-particle/)
BibTeX
@inproceedings{vandemeent2015aistats-particle,
  title = {{Particle Gibbs with Ancestor Sampling for Probabilistic Programs}},
  author = {van de Meent, Jan-Willem and Yang, Hongseok and Mansinghka, Vikash and Wood, Frank D.},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year = {2015},
  url = {https://mlanthology.org/aistats/2015/vandemeent2015aistats-particle/}
}