Conditional Mean Field

Abstract

Despite all the attention paid to variational methods based on sum-product message passing (loopy belief propagation, tree-reweighted sum-product), these methods are still bound to inference on a small set of probabilistic models. Mean field approximations have been applied to a broader set of problems, but the solutions are often poor. We propose a new class of conditionally-specified variational approximations based on mean field theory. While not usable on their own, when combined with sequential Monte Carlo they produce guaranteed improvements over conventional mean field. Moreover, experiments on a well-studied problem, inferring the stable configurations of the Ising spin glass, show that the solutions can be significantly better than those obtained using sum-product-based methods.
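
The abstract contrasts the proposed conditional mean field approximations against conventional (naive) mean field on the Ising spin glass. As a point of reference for that baseline only, here is a minimal Python sketch of naive, fully factorized mean field updates for a small Ising model. The coupling matrix J, external fields h, and inverse temperature beta are illustrative assumptions; this is not the conditional mean field algorithm from the paper.

import numpy as np

def mean_field_ising(J, h, beta=1.0, n_iters=100, tol=1e-6):
    """Coordinate-ascent naive mean field for an Ising model with symmetric
    coupling matrix J (zero diagonal) and external fields h. Returns the
    approximate marginal means m_i = E_q[x_i], with x_i in {-1, +1}."""
    n = len(h)
    m = np.zeros(n)
    for _ in range(n_iters):
        m_old = m.copy()
        for i in range(n):
            # Each site responds to the mean field produced by its neighbours:
            # m_i <- tanh(beta * (sum_j J_ij m_j + h_i)).
            m[i] = np.tanh(beta * (J[i] @ m + h[i]))
        if np.max(np.abs(m - m_old)) < tol:
            break
    return m

# Example: a 3-spin chain with ferromagnetic couplings and a weak field.
J = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
h = np.array([0.1, 0.0, -0.1])
print(mean_field_ising(J, h, beta=0.5))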

Cite

Text

Carbonetto and de Freitas. "Conditional Mean Field." Neural Information Processing Systems, 2006.

Markdown

[Carbonetto and de Freitas. "Conditional Mean Field." Neural Information Processing Systems, 2006.](https://mlanthology.org/neurips/2006/carbonetto2006neurips-conditional/)

BibTeX

@inproceedings{carbonetto2006neurips-conditional,
  title     = {{Conditional Mean Field}},
  author    = {Carbonetto, Peter and de Freitas, Nando},
  booktitle = {Neural Information Processing Systems},
  year      = {2006},
  pages     = {201-208},
  url       = {https://mlanthology.org/neurips/2006/carbonetto2006neurips-conditional/}
}