Density Propagation and Improved Bounds on the Partition Function

Abstract

Given a probabilistic graphical model, its density of states is a function that, for any likelihood value, gives the number of configurations with that probability. We introduce a novel message-passing algorithm called Density Propagation (DP) for estimating this function. We show that DP is exact for tree-structured graphical models and is, in general, a strict generalization of both the sum-product and max-product algorithms. Further, we use the density of states and tree decomposition to introduce a new family of upper and lower bounds on the partition function. For any tree decomposition, the new upper bound based on finer-grained density-of-states information is provably at least as tight as previously known bounds based on convexity of the log-partition function, and strictly stronger if a general condition holds. We conclude with empirical evidence of improvement over convex relaxations and mean-field based bounds.
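
To make the density-of-states idea concrete, below is a minimal sketch (not the paper's Density Propagation algorithm, which is a message-passing scheme) that computes the density of states of a tiny pairwise model by brute-force enumeration; the toy potentials and variable names are hypothetical. It illustrates why this object is useful: once the density of states is known, the partition function and the MAP value each reduce to a simple one-dimensional computation, which is the sense in which DP generalizes both sum-product and max-product.

```python
# Minimal sketch: density of states of a toy pairwise model by enumeration.
# The model, potentials, and names here are illustrative assumptions only.
import itertools
from collections import Counter

# Toy chain over 3 binary variables with pairwise potentials psi(x_i, x_j).
pairwise = {(0, 1): [[1.0, 2.0], [2.0, 0.5]],
            (1, 2): [[1.5, 1.0], [1.0, 3.0]]}

def weight(x):
    """Unnormalized probability (product of pairwise potentials) of configuration x."""
    w = 1.0
    for (i, j), psi in pairwise.items():
        w *= psi[x[i]][x[j]]
    return w

# Density of states: for each weight value, the number of configurations attaining it.
density = Counter(weight(x) for x in itertools.product([0, 1], repeat=3))

# Partition function as a one-dimensional sum over the density of states,
# Z = sum_w n(w) * w; the MAP value is the largest weight with n(w) > 0.
Z = sum(n * w for w, n in density.items())
map_value = max(density)
print(density, Z, map_value)
```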

Cite

Text

Ermon et al. "Density Propagation and Improved Bounds on the Partition Function." Neural Information Processing Systems, 2012.

Markdown

[Ermon et al. "Density Propagation and Improved Bounds on the Partition Function." Neural Information Processing Systems, 2012.](https://mlanthology.org/neurips/2012/ermon2012neurips-density/)

BibTeX

@inproceedings{ermon2012neurips-density,
  title     = {{Density Propagation and Improved Bounds on the Partition Function}},
  author    = {Ermon, Stefano and Sabharwal, Ashish and Selman, Bart and Gomes, Carla P.},
  booktitle = {Neural Information Processing Systems},
  year      = {2012},
  pages     = {2762--2770},
  url       = {https://mlanthology.org/neurips/2012/ermon2012neurips-density/}
}