Self-Improving Language Models for Evolutionary Program Synthesis: A Case Study on ARC-AGI
Abstract
Many program synthesis tasks prove too challenging for even state-of-the-art language models to solve in a single attempt. Search-based evolutionary methods offer a promising alternative by exploring solution spaces iteratively, but their effectiveness remains limited by the fixed capabilities of the underlying generative model. We propose SOAR, a method that learns program synthesis by integrating language models into a self-improving evolutionary loop. SOAR alternates between (1) an evolutionary search that uses an LLM to sample and refine candidate solutions, and (2) a hindsight learning phase that converts search attempts into valid problem-solution pairs used to fine-tune the LLM's sampling and refinement capabilities, enabling increasingly effective search in subsequent iterations. On the challenging ARC-AGI benchmark, SOAR achieves significant performance gains across model scales and iterations, leveraging positive transfer between the sampling and refinement fine-tuning tasks. These improvements carry over to test-time adaptation, enabling SOAR to solve 52% of the public test set.
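To make the alternation concrete, here is a minimal, self-contained Python sketch of the loop the abstract describes. It is an illustration under stated assumptions, not the authors' implementation: `StubLLM`, the `soar` driver, and the relabeling step are hypothetical placeholders for the paper's model, search, and hindsight-learning components.

```python
import random


class StubLLM:
    """Stand-in for the fine-tunable LLM (hypothetical; not the paper's API)."""

    def sample(self, task):
        # Propose a candidate program for the task; here a trivial identity map.
        return lambda grid: grid

    def refine(self, task, parent, feedback):
        # Revise a parent candidate; a real model would condition on the task,
        # the parent program's code, and its execution feedback.
        return parent

    def fine_tune(self, pairs):
        # Update weights on hindsight (task, solution) pairs; return new model.
        return self


def soar(tasks, llm, iterations=2, pop_size=4, generations=3):
    for _ in range(iterations):
        hindsight_pairs = []
        for task in tasks:
            # (1) Evolutionary search: sample candidates, then refine them.
            population = [llm.sample(task) for _ in range(pop_size)]
            for _ in range(generations):
                parent = random.choice(population)
                feedback = [(x, parent(x)) for x, _ in task["train"]]
                population.append(llm.refine(task, parent, feedback))
            # (2) Hindsight relabeling: every executed candidate is a valid
            # solution to the task defined by its own input/output behavior,
            # so even failed attempts yield problem-solution training pairs.
            for cand in population:
                relabeled = {"train": [(x, cand(x)) for x, _ in task["train"]]}
                hindsight_pairs.append((relabeled, cand))
        # Fine-tune on the relabeled pairs so that both sampling and
        # refinement improve before the next round of search.
        llm = llm.fine_tune(hindsight_pairs)
    return llm


# Toy usage: one ARC-like task whose output grid equals its input grid.
tasks = [{"train": [([[1, 0], [0, 1]], [[1, 0], [0, 1]])]}]
trained_llm = soar(tasks, StubLLM())
```

The key design point the sketch tries to capture is that hindsight relabeling turns search failures into supervision: a candidate that does not solve the original task still exactly solves the task induced by its own outputs.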
Cite
Text
Pourcel et al. "Self-Improving Language Models for Evolutionary Program Synthesis: A Case Study on ARC-AGI." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Pourcel et al. "Self-Improving Language Models for Evolutionary Program Synthesis: A Case Study on ARC-AGI." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/pourcel2025icml-selfimproving/)
BibTeX
@inproceedings{pourcel2025icml-selfimproving,
title = {{Self-Improving Language Models for Evolutionary Program Synthesis: A Case Study on ARC-AGI}},
author = {Pourcel, Julien and Colas, Cédric and Oudeyer, Pierre-Yves},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {49659--49688},
volume = {267},
url = {https://mlanthology.org/icml/2025/pourcel2025icml-selfimproving/}
}