Enhancing Reasoning Capabilities of LLMs via Principled Synthetic Logic Corpus

Abstract

Large language models (LLMs) are capable of solving a wide range of tasks, yet they have struggled with reasoning. To address this, we propose $\textbf{Additional Logic Training (ALT)}$, which aims to enhance LLMs' reasoning capabilities by program-generated logical reasoning samples. We first establish principles for designing high-quality samples by integrating symbolic logic theory and previous empirical insights. Then, based on these principles, we construct a synthetic corpus named $\textbf{Formal} \ \textbf{Logic} \ \textbf{\textit{D}eduction} \ \textbf{\textit{D}iverse}$ (FLD$_{\times 2}$), comprising numerous samples of multi-step deduction with unknown facts, diverse reasoning rules, diverse linguistic expressions, and challenging distractors. Finally, we empirically show that ALT on FLD$_{\times 2}$ substantially enhances the reasoning capabilities of state-of-the-art LLMs, including LLaMA-3.1-70B. Improvements include gains of up to 30 points on logical reasoning benchmarks, up to 10 points on math and coding benchmarks, and 5 points on the benchmark suite BBH.
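To make the corpus design principles in the abstract concrete, the sketch below shows how one might program-generate a single multi-step deduction sample with a gold proof chain and distractors. This is an illustrative toy only, not the authors' FLD$_{\times 2}$ generator: the `PREDICATES` list, the `make_sample` function, and the restriction to chained modus ponens are assumptions made here for brevity, whereas the real corpus uses a far richer set of reasoning rules and linguistic expressions.

```python
import random

# Illustrative natural-language atoms used to verbalize propositions (hypothetical).
PREDICATES = [
    "the metal is heated", "the metal expands", "the bridge flexes",
    "the sensor triggers", "the alarm sounds", "the valve opens",
]

def make_sample(depth: int = 3, n_distractors: int = 2, seed: int = 0) -> dict:
    """Generate one deduction sample: a shuffled context (base fact, rules,
    and distractors), a hypothesis, the gold proof chain, and a label."""
    rng = random.Random(seed)
    atoms = rng.sample(PREDICATES, depth + 1)

    facts, proof = [atoms[0]], []          # start from one known base fact
    current = atoms[0]
    for nxt in atoms[1:]:                  # chain `depth` modus-ponens steps
        rule = f"if {current} then {nxt}"
        facts.append(rule)
        proof.append(f"[{current}] & [{rule}] => {nxt}")
        current = nxt

    # Distractors: rules whose antecedents are never established, so they
    # cannot contribute to any valid proof of the hypothesis.
    unused = [p for p in PREDICATES if p not in atoms]
    distractors = [
        f"if {p} then {rng.choice([q for q in PREDICATES if q != p])}"
        for p in rng.sample(unused, min(n_distractors, len(unused)))
    ]

    context = facts + distractors
    rng.shuffle(context)
    return {"context": context, "hypothesis": current, "proof": proof, "label": "PROVED"}

if __name__ == "__main__":
    sample = make_sample(depth=3, n_distractors=2, seed=7)
    for key, value in sample.items():
        print(f"{key}: {value}")
```

Samples whose hypotheses cannot be derived from the context (the "unknown facts" mentioned in the abstract) would be produced analogously with different labels; this toy shows only the proved case.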

Cite

Text

Morishita et al. "Enhancing Reasoning Capabilities of LLMs via Principled Synthetic Logic Corpus." Neural Information Processing Systems, 2024. doi:10.52202/079017-2340

Markdown

[Morishita et al. "Enhancing Reasoning Capabilities of LLMs via Principled Synthetic Logic Corpus." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/morishita2024neurips-enhancing/) doi:10.52202/079017-2340

BibTeX

@inproceedings{morishita2024neurips-enhancing,
  title     = {{Enhancing Reasoning Capabilities of LLMs via Principled Synthetic Logic Corpus}},
  author    = {Morishita, Terufumi and Morio, Gaku and Yamaguchi, Atsuki and Sogawa, Yasuhiro},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2340},
  url       = {https://mlanthology.org/neurips/2024/morishita2024neurips-enhancing/}
}