Towards Verifiable Text Generation with Generative Agent

Abstract

Text generation with citations makes it easier to verify the factuality of Large Language Models' (LLMs) generations. Existing one-step generation approaches exhibit clear shortcomings in answer refinement and in-context demonstration matching. In light of these challenges, we propose R2-MGA, a Retrieval and Reflection Memory-augmented Generative Agent. Specifically, it first retrieves the best-matched memory snippet from the memory bank, then reflects on the retrieved snippet to derive a reasoning rationale, and finally combines the snippet and the rationale into the best-matched in-context demonstration. Additionally, it is capable of in-depth answer refinement with two purpose-designed modules. We evaluate R2-MGA across five LLMs on the ALCE benchmark. The results reveal R2-MGA's exceptional capabilities in text generation with citations. In particular, compared to the selected baselines, it delivers up to +58.8% and +154.7% relative performance gains on answer correctness and citation quality, respectively. Extensive analyses strongly support the design motivations behind R2-MGA.
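The abstract describes a retrieve-reflect-combine-refine pipeline. Below is a minimal Python sketch of that flow for readers skimming the entry; the function names, the lexical-overlap retriever, the prompt strings, and the single self-check refinement pass are all illustrative assumptions, not the authors' implementation.

# Illustrative sketch only: the paper does not publish this code here; the
# retriever, prompts, and refinement step are placeholders for exposition.
from dataclasses import dataclass

@dataclass
class MemorySnippet:
    question: str
    answer_with_citations: str

def similarity(a: str, b: str) -> float:
    # Toy lexical-overlap score standing in for whatever retriever R2-MGA uses.
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)

def retrieve_best_snippet(query: str, memory_bank: list[MemorySnippet]) -> MemorySnippet:
    # Step 1: retrieve the best-matched memory snippet for the incoming query.
    return max(memory_bank, key=lambda m: similarity(query, m.question))

def reflect(snippet: MemorySnippet, llm) -> str:
    # Step 2: reflect on the retrieved snippet to obtain a reasoning rationale.
    prompt = (
        "Explain, step by step, how the cited answer below is supported by its sources.\n"
        f"Q: {snippet.question}\nA: {snippet.answer_with_citations}"
    )
    return llm(prompt)

def build_demonstration(snippet: MemorySnippet, rationale: str) -> str:
    # Step 3: combine snippet and rationale into an in-context demonstration.
    return (
        f"Question: {snippet.question}\n"
        f"Rationale: {rationale}\n"
        f"Answer (with citations): {snippet.answer_with_citations}\n"
    )

def answer_with_citations(query: str, memory_bank: list[MemorySnippet], llm) -> str:
    snippet = retrieve_best_snippet(query, memory_bank)
    rationale = reflect(snippet, llm)
    demo = build_demonstration(snippet, rationale)
    draft = llm(f"{demo}\nQuestion: {query}\nAnswer (with citations):")
    # The paper's two refinement modules would iterate on the draft here;
    # a single self-check pass stands in for them in this sketch.
    return llm(f"Revise the answer so every claim is correct and cited:\n{draft}")

if __name__ == "__main__":
    # Usage with a dummy LLM callable, just to show the pipeline wiring.
    memory = [MemorySnippet("Who wrote Hamlet?", "William Shakespeare [1].")]
    dummy_llm = lambda prompt: "Stub LLM output for: " + prompt.splitlines()[0]
    print(answer_with_citations("Who is the author of Hamlet?", memory, dummy_llm))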

Cite

Text

Ji et al. "Towards Verifiable Text Generation with Generative Agent." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I23.34599

Markdown

[Ji et al. "Towards Verifiable Text Generation with Generative Agent." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/ji2025aaai-verifiable/) doi:10.1609/AAAI.V39I23.34599

BibTeX

@inproceedings{ji2025aaai-verifiable,
  title     = {{Towards Verifiable Text Generation with Generative Agent}},
  author    = {Ji, Bin and Liu, Huijun and Du, Mingzhe and Li, Shasha and Liu, Xiaodong and Ma, Jun and Yu, Jie and Ng, See-Kiong},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {24230--24238},
  doi       = {10.1609/AAAI.V39I23.34599},
  url       = {https://mlanthology.org/aaai/2025/ji2025aaai-verifiable/}
}