The Importance of Causal Structure and Facts in Evaluating Explanations
Abstract
Explanation-based Learning often requires that one among many incomplete, competing explanations be chosen. The paper describes an experiment in which human subjects ranked explanations that differ with respect to characteristics applicable in Explanation-based Learning. The two characteristics varied in the experiment were: causal chaining of the rules used in the explanation, and the degree of overlap between the facts in the example and the facts in the explanation. The results of the experiment show that people prefer explanations with good causal structure. Explanations that use more facts from the example are preferred over those that use fewer, but fact overlap matters more when the causal structure is of high quality.
Cite
Text
Gick and Matwin. "The Importance of Causal Structure and Facts in Evaluating Explanations." International Conference on Machine Learning, 1991. doi:10.1016/B978-1-55860-200-7.50014-3
Markdown
[Gick and Matwin. "The Importance of Causal Structure and Facts in Evaluating Explanations." International Conference on Machine Learning, 1991.](https://mlanthology.org/icml/1991/gick1991icml-importance/) doi:10.1016/B978-1-55860-200-7.50014-3
BibTeX
@inproceedings{gick1991icml-importance,
title = {{The Importance of Causal Structure and Facts in Evaluating Explanations}},
author = {Gick, Mary and Matwin, Stan},
booktitle = {International Conference on Machine Learning},
year = {1991},
  pages = {51--54},
doi = {10.1016/B978-1-55860-200-7.50014-3},
url = {https://mlanthology.org/icml/1991/gick1991icml-importance/}
}