LambdaBeam: Neural Program Search with Higher-Order Functions and Lambdas
Abstract
Search is an important technique in program synthesis that allows for adaptive strategies such as focusing on particular search directions based on execution results. Several prior works have demonstrated that neural models are effective at guiding program synthesis searches. However, a common drawback of those approaches is the inability to handle iterative loops, higher-order functions, or lambda functions, which prevents prior neural searches from synthesizing longer and more general programs. We address this gap by designing a search algorithm called LambdaBeam that can construct arbitrary lambda functions that compose operations within a given DSL. We create semantic vector representations of the execution behavior of the lambda functions and train a neural policy network to choose which lambdas to construct during search, and pass them as arguments to higher-order functions to perform looping computations. Our experiments show that LambdaBeam outperforms neural, symbolic, and LLM-based techniques in an integer list manipulation domain.
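To make the search concrete, below is a minimal Python sketch of the core idea: enumerate composed lambdas bottom-up, fingerprint each by its execution behavior on probe inputs (a stand-in for LambdaBeam's learned semantic vectors), prune semantic duplicates, and pass surviving candidates to higher-order functions until the input-output examples are satisfied. The toy DSL, the names (DSL_OPS, signature, enumerate_lambdas), and the brute-force enumeration in place of the learned policy network are all illustrative assumptions, not the paper's implementation.

# Hypothetical sketch of execution-guided lambda search on a toy
# integer DSL. It illustrates the idea of representing a lambda by its
# execution behavior; LambdaBeam itself uses a learned neural policy,
# not exhaustive enumeration.
from itertools import product

# First-order DSL operations that lambdas may compose.
DSL_OPS = {
    "add1": lambda x: x + 1,
    "double": lambda x: x * 2,
    "square": lambda x: x * x,
}

# Higher-order functions that accept a synthesized lambda.
HIGHER_ORDER = {
    "map": lambda f, xs: [f(x) for x in xs],
    "filter": lambda f, xs: [x for x in xs if f(x)],
}

def signature(fn, probe_inputs):
    # Semantic fingerprint: the lambda's outputs on fixed probe inputs.
    # Lambdas with identical signatures behave identically on the
    # probes, so the search keeps one representative per signature.
    return tuple(fn(x) for x in probe_inputs)

def enumerate_lambdas(max_depth, probe_inputs):
    # Bottom-up enumeration of composed lambdas, deduplicated by
    # execution signature (a crude stand-in for the policy's ranking).
    frontier = {f"{n}(x)": f for n, f in DSL_OPS.items()}
    seen = {signature(f, probe_inputs): (e, f) for e, f in frontier.items()}
    for _ in range(max_depth):
        new_frontier = {}
        for (expr, f1), (op, f2) in product(frontier.items(), DSL_OPS.items()):
            name = f"{op}({expr})"
            fn = (lambda g, h: (lambda x: h(g(x))))(f1, f2)
            sig = signature(fn, probe_inputs)
            if sig not in seen:  # prune semantic duplicates
                seen[sig] = (name, fn)
                new_frontier[name] = fn
        frontier = new_frontier
    return list(seen.values())

# Example task: [1, 2, 3] -> [2, 5, 10], solved by map(lambda x: x*x + 1).
inputs, target = [1, 2, 3], [2, 5, 10]
for name, fn in enumerate_lambdas(max_depth=2, probe_inputs=range(-2, 3)):
    if HIGHER_ORDER["map"](fn, inputs) == target:
        print(f"found: map(lambda x: {name}, xs)")
        break

The signature-based deduplication is the key design choice this sketch shares with the paper's setting: equivalence of lambdas is decided by what they compute on concrete inputs rather than by their syntax, which keeps the candidate pool small even as composition depth grows.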
Cite
Text
Shi et al. "LambdaBeam: Neural Program Search with Higher-Order Functions and Lambdas." Neural Information Processing Systems, 2023.
Markdown
[Shi et al. "LambdaBeam: Neural Program Search with Higher-Order Functions and Lambdas." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/shi2023neurips-lambdabeam/)
BibTeX
@inproceedings{shi2023neurips-lambdabeam,
title = {{LambdaBeam: Neural Program Search with Higher-Order Functions and Lambdas}},
author = {Shi, Kensen and Dai, Hanjun and Li, Wen-Ding and Ellis, Kevin and Sutton, Charles A.},
booktitle = {Neural Information Processing Systems},
year = {2023},
url = {https://mlanthology.org/neurips/2023/shi2023neurips-lambdabeam/}
}