Chain-of-Thought Reasoning Without Prompting

Abstract

In enhancing the reasoning capabilities of large language models (LLMs), prior research primarily focuses on specific prompting techniques such as few-shot or zero-shot chain-of-thought (CoT) prompting. These methods, while effective, often involve manually intensive prompt engineering. Our study takes a novel approach by asking: Can LLMs reason effectively without any prompting? Our findings reveal that, intriguingly, CoT reasoning paths can be elicited from pre-trained LLMs by simply altering the *decoding* process. Rather than conventional greedy decoding, we investigate the top-$k$ alternative tokens, uncovering that CoT paths are frequently inherent in these sequences. This approach not only bypasses the confounders of prompting but also allows us to assess the LLMs' *intrinsic* reasoning abilities. Moreover, we observe that the presence of a CoT in the decoding path correlates with a higher confidence in the model's decoded answer. This confidence metric effectively differentiates between CoT and non-CoT paths. Extensive empirical studies on various reasoning benchmarks show that the proposed CoT-decoding effectively elicits reasoning capabilities from language models, which were previously obscured by standard greedy decoding.
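The abstract's recipe can be sketched in a few lines: branch on the top-$k$ first tokens instead of taking only the greedy one, continue each branch greedily, and score each path by the model's confidence. The following is a minimal illustrative sketch using Hugging Face `transformers`, not the paper's reference implementation; the model name, the `cot_decode` helper, and the confidence proxy (mean gap between the top-two token probabilities over the whole continuation, rather than over the answer span specifically) are all assumptions made for the example.

```python
# Sketch of CoT-decoding as described in the abstract (illustrative, not the
# authors' code): branch on the top-k first tokens, decode each branch
# greedily, and rank paths by a confidence proxy.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; the paper evaluates larger pre-trained LLMs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

def cot_decode(question: str, k: int = 10, max_new_tokens: int = 64):
    inputs = tokenizer(question, return_tensors="pt")
    with torch.no_grad():
        next_logits = model(**inputs).logits[0, -1]   # logits for the first new token
    top_k_ids = torch.topk(next_logits, k).indices    # k alternative first tokens

    paths = []
    for first_id in top_k_ids:
        # Force a different first token, then continue greedily.
        ids = torch.cat([inputs.input_ids[0], first_id.view(1)]).unsqueeze(0)
        out = model.generate(
            ids,
            max_new_tokens=max_new_tokens,
            do_sample=False,                          # greedy continuation
            output_scores=True,
            return_dict_in_generate=True,
            pad_token_id=tokenizer.eos_token_id,
        )
        # Confidence proxy: average gap between top-1 and top-2 probabilities
        # across generated tokens (the paper computes this over answer tokens).
        gaps = []
        for step_scores in out.scores:
            probs = torch.softmax(step_scores[0], dim=-1)
            top2 = torch.topk(probs, 2).values
            gaps.append((top2[0] - top2[1]).item())
        text = tokenizer.decode(
            out.sequences[0, inputs.input_ids.shape[1]:],
            skip_special_tokens=True,
        )
        paths.append((sum(gaps) / len(gaps), text))

    return max(paths)  # highest-confidence path, per the abstract's observation

print(cot_decode("Q: I have 3 apples and eat one. How many are left?\nA:"))
```

Consistent with the abstract, paths that contain a chain of thought tend to score higher under this confidence measure, so ranking the $k$ branches by confidence surfaces CoT answers without any prompt engineering.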

Cite

Text

Wang and Zhou. "Chain-of-Thought Reasoning Without Prompting." Neural Information Processing Systems, 2024. doi:10.52202/079017-2123

Markdown

[Wang and Zhou. "Chain-of-Thought Reasoning Without Prompting." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/wang2024neurips-chainofthought/) doi:10.52202/079017-2123

BibTeX

@inproceedings{wang2024neurips-chainofthought,
  title     = {{Chain-of-Thought Reasoning Without Prompting}},
  author    = {Wang, Xuezhi and Zhou, Denny},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2123},
  url       = {https://mlanthology.org/neurips/2024/wang2024neurips-chainofthought/}
}