Message-Passing for Approximate MAP Inference with Latent Variables
Abstract
We consider a general inference setting for discrete probabilistic graphical models where we seek maximum a posteriori (MAP) estimates for a subset of the random variables (max nodes) while marginalizing over the rest (sum nodes). We present a hybrid message-passing algorithm to accomplish this. The hybrid algorithm passes a mix of sum and max messages depending on whether the source node is a sum node or a max node. We derive our algorithm by showing that it falls out as the solution of a particular relaxation of a variational framework. We further show that the Expectation-Maximization (EM) algorithm can be seen as an approximation to our algorithm. Experimental results on synthetic and real-world datasets demonstrate the efficacy of our proposed algorithm against several baselines.
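To make the "mix of sum and max messages" concrete, here is a minimal sketch of that flavor of hybrid message passing on a pairwise chain MRF. This is illustrative only, not the paper's algorithm: the paper handles general graphical models via a variational relaxation, whereas this sketch assumes a chain, a single forward pass, and hypothetical names (`unary`, `pairwise`, `is_max`). The one idea it demonstrates is that a message sent from a max node aggregates its local scores by maximization, while a message sent from a sum node aggregates by summation.

```python
# Illustrative sketch (not the paper's algorithm): hybrid sum/max
# message passing on a chain x_1 - x_2 - ... - x_n with k states each.
import numpy as np

def hybrid_chain_messages(unary, pairwise, is_max):
    """Forward pass along the chain.

    unary:    list of n arrays, unary[i] has shape (k,)
    pairwise: list of n-1 arrays, pairwise[i] has shape (k, k),
              coupling x_i (rows) with x_{i+1} (columns)
    is_max:   boolean list; is_max[i] marks x_i as a MAP (max) node
    """
    n = len(unary)
    msgs = [None] * n  # msgs[i] = message arriving at x_i from the left
    incoming = np.ones_like(unary[0])
    for i in range(n - 1):
        # combine local evidence with the message from the left
        combined = unary[i] * incoming            # shape (k,)
        scores = combined[:, None] * pairwise[i]  # shape (k, k)
        if is_max[i]:
            incoming = scores.max(axis=0)   # max message from a max node
        else:
            incoming = scores.sum(axis=0)   # sum message from a sum node
        incoming = incoming / incoming.sum()      # normalize for stability
        msgs[i + 1] = incoming
    return msgs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    k, n = 3, 4
    unary = [rng.random(k) for _ in range(n)]
    pairwise = [rng.random((k, k)) for _ in range(n - 1)]
    is_max = [True, False, False, True]  # x_1 and x_4 are MAP nodes
    msgs = hybrid_chain_messages(unary, pairwise, is_max)
    belief_last = unary[-1] * msgs[-1]
    print("decoded x_n:", int(belief_last.argmax()))
```

On a tree-structured model a symmetric backward pass would complete the beliefs; the paper's contribution is justifying and extending this kind of mixed-message scheme beyond such special cases.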
Cite

Text

Jiang et al. "Message-Passing for Approximate MAP Inference with Latent Variables." Neural Information Processing Systems, 2011.

Markdown

[Jiang et al. "Message-Passing for Approximate MAP Inference with Latent Variables." Neural Information Processing Systems, 2011.](https://mlanthology.org/neurips/2011/jiang2011neurips-messagepassing/)

BibTeX
@inproceedings{jiang2011neurips-messagepassing,
  title     = {{Message-Passing for Approximate MAP Inference with Latent Variables}},
  author    = {Jiang, Jiarong and Rai, Piyush and Daum{\'e} III, Hal},
  booktitle = {Neural Information Processing Systems},
  year      = {2011},
  pages     = {1197--1205},
  url       = {https://mlanthology.org/neurips/2011/jiang2011neurips-messagepassing/}
}