Information Theory and Statistical Mechanics
Abstract
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting subjective statistical mechanics, the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.
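The abstract's central claim, that the partition function emerges immediately from the maximum-entropy principle, can be sketched as the standard constrained maximization (a sketch in conventional notation, not quoted from the paper):

```latex
% Maximize the entropy of a discrete distribution {p_i} subject to
% normalization and a fixed expectation of the energy E_i:
\max_{\{p_i\}} \; S = -\sum_i p_i \ln p_i
\quad \text{subject to} \quad
\sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle .

% Introducing Lagrange multipliers and setting the variation to zero yields
p_i = \frac{e^{-\beta E_i}}{Z(\beta)},
\qquad
Z(\beta) = \sum_i e^{-\beta E_i},
\qquad
\langle E \rangle = -\frac{\partial}{\partial \beta} \ln Z(\beta).
```

Here the normalization constant \(Z(\beta)\) is exactly the partition function of statistical mechanics, which is why the usual computational rules follow with no further physical assumptions.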
Cite
Text
Jaynes. "Information Theory and Statistical Mechanics." Physical Review, 1957. doi:10.1103/PhysRev.106.620

Markdown

[Jaynes. "Information Theory and Statistical Mechanics." Physical Review, 1957.](https://mlanthology.org/misc/1957/jaynes1957misc-information/) doi:10.1103/PhysRev.106.620

BibTeX
@misc{jaynes1957misc-information,
  title = {{Information Theory and Statistical Mechanics}},
  author = {Jaynes, Edwin T.},
  howpublished = {Physical Review},
  year = {1957},
  pages = {620--630},
  doi = {10.1103/PhysRev.106.620},
  volume = {106},
  number = {4},
  url = {https://mlanthology.org/misc/1957/jaynes1957misc-information/}
}