Information Geometry of Mean-Field Approximation

Abstract

I present a general theory of mean-field approximation based on information geometry and applicable not only to Boltzmann machines but also to wider classes of statistical models. Using perturbation expansion of the Kullback divergence (or Plefka expansion in statistical physics), a formulation of mean-field approximation of general orders is derived. It includes in a natural way the “naive” mean-field approximation and is consistent with the Thouless-Anderson-Palmer (TAP) approach and the linear response theorem in statistical physics.
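As context for the "naive" mean-field approximation the abstract refers to, here is a minimal sketch for a Boltzmann machine: the intractable Gibbs distribution is replaced by a factorized one, and the mean activations satisfy the self-consistent equations m_i = tanh(h_i + Σ_j W_ij m_j), solved by damped fixed-point iteration. This is an illustrative sketch of the standard naive mean-field equations, not the paper's general information-geometric formulation; the function name and parameters are mine.

```python
import numpy as np

def naive_mean_field(W, h, n_iter=200, damping=0.5):
    """Naive mean-field magnetizations for a Boltzmann machine with
    symmetric couplings W (zero diagonal) and biases h, via damped
    fixed-point iteration of m_i = tanh(h_i + sum_j W_ij m_j)."""
    m = np.zeros_like(h, dtype=float)
    for _ in range(n_iter):
        m = (1 - damping) * m + damping * np.tanh(h + W @ m)
    return m

# Small illustrative example: two coupled +/-1 units.
W = np.array([[0.0, 0.3],
              [0.3, 0.0]])
h = np.array([0.1, -0.2])
m = naive_mean_field(W, h)
```

The TAP approach discussed in the paper refines these equations with the Onsager reaction term, which the perturbation (Plefka) expansion recovers at second order.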

Cite

Text

Tanaka. "Information Geometry of Mean-Field Approximation." Neural Computation, 12:1951-1968, 2000. doi:10.1162/089976600300015213

Markdown

[Tanaka. "Information Geometry of Mean-Field Approximation." Neural Computation, 12:1951-1968, 2000.](https://mlanthology.org/neco/2000/tanaka2000neco-information/) doi:10.1162/089976600300015213

BibTeX

@article{tanaka2000neco-information,
  title     = {{Information Geometry of Mean-Field Approximation}},
  author    = {Tanaka, Toshiyuki},
  journal   = {Neural Computation},
  year      = {2000},
  pages     = {1951--1968},
  doi       = {10.1162/089976600300015213},
  volume    = {12},
  url       = {https://mlanthology.org/neco/2000/tanaka2000neco-information/}
}