Scaling Neural Program Synthesis with Distribution-Based Search

Abstract

We consider the problem of automatically constructing computer programs from input-output examples. We investigate how to augment probabilistic and neural program synthesis methods with new search algorithms, proposing a framework called distribution-based search. Within this framework, we introduce two new search algorithms: Heap Search, an enumerative method, and SQRT Sampling, a probabilistic method. We prove certain optimality guarantees for both methods, show how they integrate with probabilistic and neural techniques, and demonstrate how they can operate at scale across parallel compute environments. Collectively, these findings offer theoretical and applied studies of search algorithms for program synthesis that integrate with recent developments in machine-learned program synthesizers.
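
The abstract only names the two search strategies; as a purely illustrative sketch (not the paper's algorithm), the distribution-based search idea of drawing candidate programs from a learned distribution and checking them against input-output examples might look like the following. The candidate set, its probabilities, and all helper names are hypothetical stand-ins for a learned model's predictions.

```python
import random

def sqrt_rebalance(probs):
    """Rebalance a distribution by taking square roots and renormalizing.

    This only mirrors the *idea* behind SQRT Sampling (sampling in proportion
    to the square root of each program's probability); the paper defines the
    method over probabilistic grammars, not a finite list as here.
    """
    weights = {p: q ** 0.5 for p, q in probs.items()}
    total = sum(weights.values())
    return {p: w / total for p, w in weights.items()}

def distribution_based_search(probs, is_correct, budget=1000):
    """Sample candidate programs, returning the first one consistent with the examples."""
    rebalanced = sqrt_rebalance(probs)
    programs, weights = zip(*rebalanced.items())
    for _ in range(budget):
        candidate = random.choices(programs, weights=weights, k=1)[0]
        if is_correct(candidate):
            return candidate
    return None

# Hypothetical toy usage: "programs" are Python lambdas over integers, with
# made-up probabilities standing in for a neural predictor's output.
candidates = {
    "lambda x: x + 1": 0.6,
    "lambda x: x * 2": 0.3,
    "lambda x: x - 1": 0.1,
}
examples = [(1, 2), (3, 6)]  # input-output pairs solved by doubling
check = lambda src: all(eval(src)(i) == o for i, o in examples)
print(distribution_based_search(candidates, check))  # expected: "lambda x: x * 2"
```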

Cite

Text

Fijalkow et al. "Scaling Neural Program Synthesis with Distribution-Based Search." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I6.20616

Markdown

[Fijalkow et al. "Scaling Neural Program Synthesis with Distribution-Based Search." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/fijalkow2022aaai-scaling/) doi:10.1609/AAAI.V36I6.20616

BibTeX

@inproceedings{fijalkow2022aaai-scaling,
  title     = {{Scaling Neural Program Synthesis with Distribution-Based Search}},
  author    = {Fijalkow, Nathanaël and Lagarde, Guillaume and Matricon, Théo and Ellis, Kevin and Ohlmann, Pierre and Potta, Akarsh Nayan},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2022},
  pages     = {6623-6630},
  doi       = {10.1609/AAAI.V36I6.20616},
  url       = {https://mlanthology.org/aaai/2022/fijalkow2022aaai-scaling/}
}