Neural Symbolic Regression That Scales
Abstract
Symbolic equations are at the core of scientific discovery. The task of discovering the underlying equation from a set of input-output pairs is called symbolic regression. Traditionally, symbolic regression methods use hand-designed strategies that do not improve with experience. In this paper, we introduce the first symbolic regression method that leverages large-scale pre-training. We procedurally generate an unbounded set of equations, and simultaneously pre-train a Transformer to predict the symbolic equation from a corresponding set of input-output pairs. At test time, we query the model on a new set of points and use its output to guide the search for the equation. We show empirically that this approach can re-discover a set of well-known physical equations, and that it improves over time with more data and compute.
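The data-generation step described in the abstract (procedurally sampling equations, then evaluating them to produce input-output pairs for pre-training) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the operator set, sampling probabilities, and function names (`sample_tree`, `make_example`) are assumptions chosen for clarity.

```python
import math
import random

# Illustrative operator pools; the paper's actual vocabulary is richer.
UNARY = {"sin": math.sin, "cos": math.cos, "exp": math.exp}
BINARY = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def sample_tree(depth=0, max_depth=3):
    """Sample a random expression tree in one variable x.

    Returns a (prefix-token list, callable) pair: the token list is the
    symbolic target the Transformer would learn to predict, the callable
    evaluates the same expression numerically.
    """
    r = random.random()
    if depth >= max_depth or r < 0.3:
        return ["x"], lambda x: x
    if r < 0.6:
        op = random.choice(sorted(UNARY))
        toks, f = sample_tree(depth + 1, max_depth)
        return [op] + toks, lambda x, g=UNARY[op], f=f: g(f(x))
    op = random.choice(sorted(BINARY))
    ltoks, lf = sample_tree(depth + 1, max_depth)
    rtoks, rf = sample_tree(depth + 1, max_depth)
    return [op] + ltoks + rtoks, lambda x, g=BINARY[op], lf=lf, rf=rf: g(lf(x), rf(x))

def make_example(n_points=10):
    """One hypothetical pre-training example: the set of (x, y) points is
    the model's input, the prefix token sequence is its prediction target."""
    tokens, f = sample_tree()
    xs = [random.uniform(-1.0, 1.0) for _ in range(n_points)]
    points = [(x, f(x)) for x in xs]
    return points, tokens

if __name__ == "__main__":
    random.seed(0)
    points, tokens = make_example()
    print("target tokens (prefix notation):", tokens)
    print("first conditioning point:", points[0])
```

At scale, millions of such (points, tokens) pairs would be streamed to the Transformer during pre-training; at test time, the predicted token sequence seeds the search over candidate equations.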
Cite
Text
Biggio et al. "Neural Symbolic Regression That Scales." International Conference on Machine Learning, 2021.
Markdown
[Biggio et al. "Neural Symbolic Regression That Scales." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/biggio2021icml-neural/)
BibTeX
@inproceedings{biggio2021icml-neural,
title = {{Neural Symbolic Regression That Scales}},
author = {Biggio, Luca and Bendinelli, Tommaso and Neitz, Alexander and Lucchi, Aurelien and Parascandolo, Giambattista},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {936--945},
volume = {139},
url = {https://mlanthology.org/icml/2021/biggio2021icml-neural/}
}