Probability Update: Conditioning vs. Cross-Entropy
Abstract
Conditioning is the generally agreed-upon method for updating probability distributions when one learns that an event is certainly true. But it has been argued that we need other rules, in particular the rule of cross-entropy minimization, to handle updates that involve uncertain information. In this paper we re-examine such a case: van Fraassen's Judy Benjamin problem [1987], which in essence asks how one might update given the value of a conditional probability. We argue that, contrary to the suggestions in the literature, it is possible to use simple conditionalization in this case, and thereby obtain answers that agree fully with intuition. This contrasts with proposals such as cross-entropy, which are easier to apply but can give unsatisfactory answers. Based on the lessons from this example, we speculate on some general philosophical issues concerning probability update.
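To make the contrast concrete, the following is a minimal Python sketch of the cross-entropy update in the Judy Benjamin problem, under the standard formulation reported in the literature: three atoms (Blue territory, Red HQ area, Red Second Company area) with prior (1/2, 1/4, 1/4), and the new information P(HQ | Red) = 3/4. Because that constraint fixes how the Red mass splits, minimizing the KL divergence to the prior reduces to a one-dimensional search over P(Blue). The setup and names here are illustrative assumptions, not code or notation from the paper.

import numpy as np
from scipy.optimize import minimize_scalar

# Prior over the three atoms: Blue, Red-and-HQ, Red-and-Second-Company.
prior = np.array([0.5, 0.25, 0.25])

def kl_to_prior(b):
    # Candidate posterior with P(Blue) = b.  The constraint
    # P(HQ | Red) = 3/4 forces the remaining mass r = 1 - b to split
    # as (3r/4, r/4), so D(q || prior) depends on b alone.
    r = 1.0 - b
    q = np.array([b, 0.75 * r, 0.25 * r])
    return float(np.sum(q * np.log(q / prior)))

res = minimize_scalar(kl_to_prior, bounds=(1e-9, 1.0 - 1e-9), method="bounded")
print(f"cross-entropy posterior P(Blue) = {res.x:.4f}")  # approx. 0.5327

Intuitively, learning only a conditional probability about Red territory should leave P(Blue) at 1/2, yet the minimum-cross-entropy posterior raises it to roughly 0.53; this is the kind of unsatisfactory answer the abstract alludes to, which the paper argues conditioning (in a suitably enriched space) can avoid.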
Cite

Text
Grove and Halpern. "Probability Update: Conditioning vs. Cross-Entropy." Conference on Uncertainty in Artificial Intelligence, 1997.

Markdown
[Grove and Halpern. "Probability Update: Conditioning vs. Cross-Entropy." Conference on Uncertainty in Artificial Intelligence, 1997.](https://mlanthology.org/uai/1997/grove1997uai-probability/)

BibTeX
@inproceedings{grove1997uai-probability,
  title = {{Probability Update: Conditioning vs. Cross-Entropy}},
  author = {Grove, Adam J. and Halpern, Joseph Y.},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year = {1997},
  pages = {208--214},
  url = {https://mlanthology.org/uai/1997/grove1997uai-probability/}
}