Uncertain Reasoning Using Maximum Entropy Inference

Abstract

The use of maximum entropy inference in reasoning with uncertain information is commonly justified by an information-theoretic argument. This paper discusses a possible objection to this information-theoretic justification and shows how it can be met. I then compare maximum entropy inference with certain other currently popular methods for uncertain reasoning. In making such a comparison, one must distinguish between static and dynamic theories of degrees of belief: a static theory concerns the consistency conditions for degrees of belief at a given time, whereas a dynamic theory concerns how one's degrees of belief should change in the light of new information. It is argued that maximum entropy is a dynamic theory and that a complete theory of uncertain reasoning can be obtained by combining maximum entropy inference with probability theory, which is a static theory. This combined theory, I argue, is much better grounded than other theories of uncertain reasoning.
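The technique the abstract centers on, maximum entropy inference, selects the probability distribution that is maximally noncommittal subject to the constraints one's evidence imposes. As a minimal illustration (not drawn from the paper itself), the sketch below solves Jaynes's well-known dice problem: given only that a die's long-run mean is 4.5, the maximum-entropy distribution has the exponential form p_i ∝ exp(λ·i), and the multiplier λ can be found by bisection.

```python
import math

def max_entropy_dist(values, target_mean, lo=-10.0, hi=10.0, iters=100):
    """Maximum-entropy distribution over `values` with mean `target_mean`.

    The constrained solution has the exponential-family form
    p_i ∝ exp(lam * v_i); since the resulting mean is monotonically
    increasing in lam, we solve for lam by bisection.
    """
    def mean_for(lam):
        weights = [math.exp(lam * v) for v in values]
        z = sum(weights)
        return sum(w * v for w, v in zip(weights, values)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    weights = [math.exp(lam * v) for v in values]
    z = sum(weights)
    return [w / z for w in weights]

# Jaynes's dice example: faces 1..6, observed long-run mean 4.5.
# The resulting distribution tilts toward the higher faces while
# committing to nothing beyond the mean constraint.
p = max_entropy_dist([1, 2, 3, 4, 5, 6], 4.5)
```

Note that this only illustrates the static selection of a distribution under a constraint; the paper's point is that maximum entropy serves as a *dynamic* rule for updating degrees of belief, with probability theory supplying the static consistency conditions.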

Cite

Text

Daniel Hunter. "Uncertain Reasoning Using Maximum Entropy Inference." Conference on Uncertainty in Artificial Intelligence, 1985. doi:10.1016/B978-0-444-70058-2.50019-X

Markdown

[Daniel Hunter. "Uncertain Reasoning Using Maximum Entropy Inference." Conference on Uncertainty in Artificial Intelligence, 1985.](https://mlanthology.org/uai/1985/hunter1985uai-uncertain/) doi:10.1016/B978-0-444-70058-2.50019-X

BibTeX

@inproceedings{hunter1985uai-uncertain,
  title     = {{Uncertain Reasoning Using Maximum Entropy Inference}},
  author    = {Hunter, Daniel},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {1985},
  pages     = {203--210},
  doi       = {10.1016/B978-0-444-70058-2.50019-X},
  url       = {https://mlanthology.org/uai/1985/hunter1985uai-uncertain/}
}