Slice Sampling Particle Belief Propagation
Abstract
Inference in continuous-label Markov random fields is a challenging task. We use particle belief propagation (PBP) to solve the inference problem in a continuous label space. Sampling particles from the belief distribution is typically done with Metropolis-Hastings (MH) Markov chain Monte Carlo (MCMC) methods, which involve sampling from a proposal distribution. This proposal distribution has to be carefully designed for the particular model and input data to achieve fast convergence. We propose to avoid the dependence on a proposal distribution by introducing a slice-sampling-based PBP algorithm. The proposed approach shows superior convergence performance on an image denoising toy example. Our findings are validated on a challenging relational 2D feature tracking application.
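The core idea is to draw each particle update with slice sampling, which needs no hand-tuned proposal distribution, only an unnormalized (log) density such as a particle's local belief. As a rough illustration only (not the paper's full S-PBP algorithm), the following Python sketch shows a standard one-dimensional slice-sampling step (Neal, 2003) with stepping-out and shrinkage; the names `log_density` and the bracket width `w` are illustrative assumptions.

```python
import numpy as np

def slice_sample_1d(x0, log_density, w=1.0, max_steps=50, rng=None):
    """One slice-sampling update for a scalar variable.

    log_density: unnormalized log density, e.g. a particle's local log belief.
    w: initial bracket width (the only tuning parameter; no proposal needed).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Draw the slice level: log u = log p(x0) - Exponential(1).
    log_y = log_density(x0) - rng.exponential(1.0)

    # Step out to find an interval [left, right] that covers the slice.
    left = x0 - w * rng.uniform()
    right = left + w
    steps = max_steps
    while steps > 0 and log_density(left) > log_y:
        left -= w
        steps -= 1
    steps = max_steps
    while steps > 0 and log_density(right) > log_y:
        right += w
        steps -= 1

    # Shrinkage: sample uniformly in the interval, shrinking it on rejection.
    while True:
        x1 = rng.uniform(left, right)
        if log_density(x1) > log_y:
            return x1
        if x1 < x0:
            left = x1
        else:
            right = x1
```

In a PBP setting, a step like this would be applied per particle and per dimension, with `log_density` evaluating the node's belief given the current messages; that integration is specific to the paper and is not reproduced here.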
Cite
Text
Muller et al. "Slice Sampling Particle Belief Propagation." International Conference on Computer Vision, 2013. doi:10.1109/ICCV.2013.144

Markdown
[Muller et al. "Slice Sampling Particle Belief Propagation." International Conference on Computer Vision, 2013.](https://mlanthology.org/iccv/2013/muller2013iccv-slice/) doi:10.1109/ICCV.2013.144

BibTeX
@inproceedings{muller2013iccv-slice,
title = {{Slice Sampling Particle Belief Propagation}},
author = {Muller, Oliver and Yang, Michael Ying and Rosenhahn, Bodo},
booktitle = {International Conference on Computer Vision},
year = {2013},
doi = {10.1109/ICCV.2013.144},
url = {https://mlanthology.org/iccv/2013/muller2013iccv-slice/}
}