RuAG: Learned-Rule-Augmented Generation for Large Language Models

Abstract

In-context learning (ICL) and Retrieval-Augmented Generation (RAG) have gained attention for their ability to enhance LLMs' reasoning by incorporating external knowledge, but they suffer from limited context window size, leading to insufficient information injection. To this end, we propose a novel framework, RuAG, that automatically distills large volumes of offline data into interpretable first-order logic rules, which are injected into LLMs to boost their reasoning capabilities. Our method begins by formulating the rule search process based on LLMs' commonsense, where the LLM automatically defines the head and body predicates. We then apply Monte Carlo Tree Search (MCTS) to handle the combinatorial search space and efficiently discover logic rules from data. The resulting logic rules are translated into natural language, allowing targeted knowledge injection and seamless integration into LLM prompts for downstream task reasoning. We evaluate our framework on public tasks and private industrial tasks, spanning Natural Language Processing (NLP), time series, decision-making, and industrial applications, demonstrating its effectiveness in enhancing LLM capabilities across diverse tasks.
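
The pipeline can be pictured with a small, self-contained sketch (not the authors' implementation): a head predicate and candidate body predicates are assumed to have already been proposed by an LLM, a simplified MCTS searches the combinatorial space of rule bodies for one with high precision on toy offline data, and the learned rule is then verbalized so it can be placed into an LLM prompt. The toy data, the predicate names (is_raining, has_umbrella, stays_inside), the precision-based reward, and all hyperparameters are illustrative assumptions.

# Illustrative sketch only (not the paper's code): given a head predicate and
# candidate body predicates -- assumed here to have been proposed by an LLM --
# use a simplified Monte Carlo Tree Search to find a high-precision rule body
# over toy boolean offline data, then verbalize the rule for prompt injection.
import math
import random

# Toy offline data: each row assigns truth values to the candidate predicates.
DATA = [
    {"is_raining": True,  "has_umbrella": False, "stays_inside": True},
    {"is_raining": True,  "has_umbrella": True,  "stays_inside": True},
    {"is_raining": False, "has_umbrella": False, "stays_inside": False},
    {"is_raining": False, "has_umbrella": True,  "stays_inside": False},
]
HEAD = "stays_inside"                        # head predicate (assumed LLM-chosen)
CANDIDATES = ["is_raining", "has_umbrella"]  # body predicates (assumed LLM-defined)


def precision(body):
    """Reward of a rule 'body -> HEAD': P(HEAD | all body predicates hold)."""
    covered = [row for row in DATA if all(row[p] for p in body)]
    if not covered:
        return 0.0
    return sum(row[HEAD] for row in covered) / len(covered)


class Node:
    """One MCTS node; its state is the set of body predicates chosen so far."""

    def __init__(self, body, parent=None):
        self.body = frozenset(body)
        self.parent = parent
        self.children = []
        self.visits = 0
        self.total = 0.0

    def untried(self):
        # Predicates not yet in the body and not yet used to expand a child.
        tried = {next(iter(c.body - self.body)) for c in self.children}
        return [p for p in CANDIDATES if p not in self.body and p not in tried]

    def best_child(self, c=1.4):
        # Standard UCB1 selection.
        return max(self.children, key=lambda ch: ch.total / ch.visits
                   + c * math.sqrt(math.log(self.visits) / ch.visits))


def search_rule(iterations=200):
    root = Node(frozenset())
    for _ in range(iterations):
        # 1. Selection: descend through fully expanded nodes by UCB1.
        node = root
        while node.children and not node.untried():
            node = node.best_child()
        # 2. Expansion: add one unused body predicate, if any remain.
        if node.untried():
            child = Node(node.body | {random.choice(node.untried())}, parent=node)
            node.children.append(child)
            node = child
        # 3. Simulation: randomly extend the body with leftover predicates.
        body = set(node.body)
        for p in CANDIDATES:
            if p not in body and random.random() < 0.5:
                body.add(p)
        reward = precision(body)
        # 4. Backpropagation.
        while node is not None:
            node.visits += 1
            node.total += reward
            node = node.parent
    # Pick the most precise (and, on ties, shortest) non-empty body in the tree.
    nodes, stack = [], [root]
    while stack:
        n = stack.pop()
        nodes.append(n)
        stack.extend(n.children)
    best = max((n for n in nodes if n.body),
               key=lambda n: (precision(n.body), -len(n.body)))
    return set(best.body)


def verbalize(body, head):
    """Translate a learned rule into a natural-language sentence for the prompt."""
    condition = " and ".join(p.replace("_", " ") for p in sorted(body))
    return f"If {condition}, then {head.replace('_', ' ')}."


if __name__ == "__main__":
    rule_body = search_rule()
    print(verbalize(rule_body, HEAD))  # e.g. "If is raining, then stays inside."

On this toy data the search recovers "If is raining, then stays inside."; compact, human-readable statements of this form are what the framework injects into the prompt rather than raw offline records.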

Cite

Text

Zhang et al. "RuAG: Learned-Rule-Augmented Generation for Large Language Models." International Conference on Learning Representations, 2025.

Markdown

[Zhang et al. "RuAG: Learned-Rule-Augmented Generation for Large Language Models." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/zhang2025iclr-ruag/)

BibTeX

@inproceedings{zhang2025iclr-ruag,
  title     = {{RuAG: Learned-Rule-Augmented Generation for Large Language Models}},
  author    = {Zhang, Yudi and Xiao, Pei and Wang, Lu and Zhang, Chaoyun and Fang, Meng and Du, Yali and Puzyrev, Yevgeniy and Yao, Randolph and Qin, Si and Lin, Qingwei and Pechenizkiy, Mykola and Zhang, Dongmei and Rajmohan, Saravan and Zhang, Qi},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/zhang2025iclr-ruag/}
}