ImProver: Agent-Based Automated Proof Optimization

Abstract

Large language models (LLMs) have been used to generate formal proofs of mathematical theorems in proof assistants such as Lean. However, we often want to optimize a formal proof with respect to various criteria, depending on its downstream use. For example, we may want a proof to adhere to a certain style, be declaratively structured, or be concise. Having suitably optimized proofs is also important for learning tasks, especially since human-written proofs may not be optimal for that purpose. To this end, we study a new problem of automated proof optimization: rewriting a proof so that it is correct and optimizes for an arbitrary criterion, such as length or declarativity. As a first method for automated proof optimization, we present ImProver, a large-language-model agent that rewrites proofs to optimize arbitrary user-defined metrics in Lean. We find that naively applying LLMs to proof optimization falls short, and we incorporate various improvements into ImProver, including symbolic Lean context through a novel Chain-of-States technique, as well as error correction and retrieval. We test ImProver on rewriting real-world undergraduate, competition, and research-level mathematics theorems, finding that it can rewrite proofs so that they are substantially shorter and more declarative in structure.
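
To make the optimization objective concrete, here is a small Lean 4 illustration (a hypothetical example of ours, not taken from the paper's benchmarks): the same theorem proved first with a verbose tactic script and then as the shorter, more declarative term-mode proof that a length metric would favor.

```lean
-- Hypothetical illustration of length optimization: two proofs of the
-- same statement, where the second is the kind of rewrite a length
-- metric would reward.

-- A verbose tactic proof of commutativity of addition on Nat.
theorem add_comm_verbose (a b : Nat) : a + b = b + a := by
  induction a with
  | zero => rw [Nat.zero_add, Nat.add_zero]
  | succ n ih => rw [Nat.succ_add, ih, Nat.add_succ]

-- A length-optimized rewrite: a one-line term-mode proof that simply
-- delegates to the existing library lemma.
theorem add_comm_short (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

Both proofs are accepted by Lean; an optimizer scoring on proof length would prefer `add_comm_short`, and a declarativity criterion of the kind the abstract describes would likewise favor structured, term-level proofs over opaque tactic scripts.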

Cite

Text

Ahuja et al. "ImProver: Agent-Based Automated Proof Optimization." International Conference on Learning Representations, 2025.

Markdown

[Ahuja et al. "ImProver: Agent-Based Automated Proof Optimization." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/ahuja2025iclr-improver/)

BibTeX

@inproceedings{ahuja2025iclr-improver,
  title     = {{ImProver: Agent-Based Automated Proof Optimization}},
  author    = {Ahuja, Riyaz and Avigad, Jeremy and Tetali, Prasad and Welleck, Sean},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/ahuja2025iclr-improver/}
}