Bayesian Inference as Iterated Random Functions with Applications to Sequential Inference in Graphical Models
Abstract
We propose a general formalism of iterated random functions with the semigroup property, under which exact and approximate Bayesian posterior updates can be viewed as specific instances. We present a convergence theory for such iterated random functions. As an application of the general theory, we analyze the convergence behavior of exact and approximate message-passing algorithms arising in a sequential change point detection problem formulated via a latent-variable directed graphical model. The sequential inference algorithm and its supporting theory are illustrated with simulated examples.
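To make the "posterior update as an iterated random function" viewpoint concrete, here is a minimal sketch of one classical instance: the exact Shiryaev-type recursion for a single change point with a geometric prior and known pre-/post-change Gaussian densities. The map from the old posterior to the new one depends on the incoming observation, so the sequence of updates is an iterated random function on [0, 1]. The function name, parameter values, and densities below are illustrative assumptions, not the paper's specific algorithm or its multi-sensor graphical-model setting.

```python
import numpy as np
from scipy.stats import norm

def shiryaev_update(p_prev, x, rho, f0, f1):
    """One exact posterior update for a single change point (illustrative).

    p_prev : posterior probability that the change has occurred by time n-1
    x      : new observation at time n
    rho    : geometric prior parameter, P(change at n | no change before n)
    f0, f1 : pre- and post-change densities, evaluated at x

    The map p_prev -> p_new depends on the random observation x, so the
    sequence of such updates forms an iterated random function on [0, 1].
    """
    prior_change = p_prev + (1.0 - p_prev) * rho   # prob. of change by time n, before seeing x
    num = prior_change * f1(x)
    den = num + (1.0 - prior_change) * f0(x)
    return num / den

# Illustrative run: Gaussian mean shift at time 50 (all values are assumptions).
rng = np.random.default_rng(0)
n, change_point, rho = 100, 50, 0.01
x = np.concatenate([rng.normal(0.0, 1.0, change_point),
                    rng.normal(1.0, 1.0, n - change_point)])

f0 = lambda v: norm.pdf(v, loc=0.0, scale=1.0)   # pre-change density
f1 = lambda v: norm.pdf(v, loc=1.0, scale=1.0)   # post-change density

p = 0.0
for t in range(n):
    p = shiryaev_update(p, x[t], rho, f0, f1)
print(f"posterior P(change by time {n}) = {p:.3f}")
```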
Cite
Text
Amini and Nguyen. "Bayesian Inference as Iterated Random Functions with Applications to Sequential Inference in Graphical Models." Neural Information Processing Systems, 2013.
Markdown
[Amini and Nguyen. "Bayesian Inference as Iterated Random Functions with Applications to Sequential Inference in Graphical Models." Neural Information Processing Systems, 2013.](https://mlanthology.org/neurips/2013/amini2013neurips-bayesian/)
BibTeX
@inproceedings{amini2013neurips-bayesian,
title = {{Bayesian Inference as Iterated Random Functions with Applications to Sequential Inference in Graphical Models}},
author = {Amini, Arash and Nguyen, Xuanlong},
booktitle = {Neural Information Processing Systems},
year = {2013},
pages = {2922-2930},
url = {https://mlanthology.org/neurips/2013/amini2013neurips-bayesian/}
}