Analogical Chaining with Natural Language Instruction for Commonsense Reasoning
Abstract
Understanding commonsense reasoning is one of the core challenges of AI. We are exploring an approach inspired by cognitive science, called analogical chaining, to create cognitive systems that can perform commonsense reasoning. Just as rules are chained in deductive systems, multiple analogies build upon each other’s inferences in analogical chaining. The cases used in analogical chaining – called common sense units – are small, to provide inferential focus and broader transfer. Importantly, such common sense units can be learned via natural language instruction, thereby increasing the ease of extending such systems. This paper describes analogical chaining, natural language instruction via microstories, and some subtleties that arise in controlling reasoning. The utility of this technique is demonstrated by the performance of an implemented system on problems from the Choice of Plausible Alternatives test of commonsense causal reasoning.
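The chaining idea described above can be illustrated with a minimal toy sketch: small cases act as premise-to-inference units, and a fact inferred by one case can satisfy the premises of another. This is an illustration only, with hypothetical facts and a naive subset match standing in for the paper's actual analogical matcher (SME) and Companion architecture.

```python
# Toy sketch of analogical chaining (illustration only; not the paper's
# SME/Companion implementation). Cases ("common sense units") are tiny
# premise -> inference pairs; inferences from one case can trigger another.

def chain(facts, cases, max_steps=10):
    """Repeatedly apply any case whose premises all hold, adding its
    conclusion to the fact set, until no case adds anything new."""
    facts = set(facts)
    for _ in range(max_steps):
        new = {concl for prem, concl in cases
               if prem <= facts and concl not in facts}
        if not new:
            break
        facts |= new
    return facts

# Hypothetical micro-cases: (frozenset of premises, conclusion).
cases = [
    (frozenset({"rain"}), "ground-wet"),
    (frozenset({"ground-wet"}), "slippery"),
]

# Two chained steps: "rain" yields "ground-wet", which yields "slippery".
print(chain({"rain"}, cases))
```

The second case fires only because the first case's inference entered working memory, which is the chaining behavior the abstract describes.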
Cite
Text
Blass, Joseph A. and Forbus, Kenneth D. "Analogical Chaining with Natural Language Instruction for Commonsense Reasoning." AAAI Conference on Artificial Intelligence, 2017. doi:10.1609/AAAI.V31I1.11153
BibTeX
@inproceedings{blass2017aaai-analogical,
title = {{Analogical Chaining with Natural Language Instruction for Commonsense Reasoning}},
author = {Blass, Joseph A. and Forbus, Kenneth D.},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2017},
pages = {4357-4363},
doi = {10.1609/AAAI.V31I1.11153},
url = {https://mlanthology.org/aaai/2017/blass2017aaai-analogical/}
}