Can Stories Help LLMs Reason? Curating Information Space Through Narrative

Abstract

Narrative is widely recognized as a powerful tool for structuring information and facilitating comprehension of complex ideas in various domains, such as science communication. This paper investigates whether incorporating narrative elements can help Large Language Models (LLMs) solve complex tasks more effectively. We propose a novel approach, Story of Thought (SoT), which integrates narrative structures into prompting techniques for problem-solving tasks. This approach involves constructing narratives around problem statements and creating a framework to identify and organize relevant information. We hypothesize that this narrative-based information curation process enhances problem comprehension by contextualizing critical information and highlighting causal relationships within the problem space. Our experimental results show that the SoT approach consistently surpasses Chain of Thought (CoT) and Analogical Reasoning on GPQA tasks, achieving higher accuracy and better solutions on physics, chemistry, and biology problems across all tested OpenAI, Meta, and Mistral LLMs.
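The abstract describes SoT as a two-stage process: first construct a narrative around the problem statement, then solve the problem using that narrative as an organizing frame. The paper's exact prompt templates are not reproduced here; the sketch below is only an illustration of how such a two-stage prompt might be composed, and all of its wording (and the function name `build_sot_prompt`) is our assumption, not the authors' template.

```python
def build_sot_prompt(problem: str) -> str:
    """Compose an illustrative Story-of-Thought-style prompt (hypothetical
    wording): first ask the model to frame the problem as a narrative that
    surfaces entities, context, and causal links, then to solve the problem
    using that narrative to organize the relevant information."""
    return (
        "You will solve a problem by first turning it into a story.\n\n"
        f"Problem: {problem}\n\n"
        "Step 1: Write a short narrative that introduces the entities in "
        "the problem, the context they appear in, and the causal "
        "relationships between them.\n"
        "Step 2: Using the narrative to curate and organize the relevant "
        "information, reason through the problem and state the final "
        "answer."
    )

if __name__ == "__main__":
    # The resulting string would be sent to an LLM as a single prompt.
    print(build_sot_prompt("Why does ice float on water?"))
```

In practice the two stages could also be issued as separate LLM calls, feeding the generated narrative into the solving prompt; the single-prompt form above is just the simplest variant.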

Cite

Text

Javadi et al. "Can Stories Help LLMs Reason? Curating Information Space Through Narrative." NeurIPS 2024 Workshops: Sys2-Reasoning, 2024.

Markdown

[Javadi et al. "Can Stories Help LLMs Reason? Curating Information Space Through Narrative." NeurIPS 2024 Workshops: Sys2-Reasoning, 2024.](https://mlanthology.org/neuripsw/2024/javadi2024neuripsw-stories/)

BibTeX

@inproceedings{javadi2024neuripsw-stories,
  title     = {{Can Stories Help LLMs Reason? Curating Information Space Through Narrative}},
  author    = {Javadi, Vahid Sadiri and Trippas, Johanne and Flek, Lucie},
  booktitle = {NeurIPS 2024 Workshops: Sys2-Reasoning},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/javadi2024neuripsw-stories/}
}