Incorporating Domain Knowledge into Neural-Guided Search via in Situ Priors and Constraints

Abstract

Many AutoML problems involve optimizing discrete objects under a black-box reward. Neural-guided search provides a flexible means of searching these combinatorial spaces using an autoregressive recurrent neural network. A major benefit of this approach is that it builds up objects $\textit{sequentially}$—this provides an opportunity to incorporate domain knowledge into the search by directly modifying the logits emitted during sampling. In this work, we formalize a framework for incorporating such $\textit{in situ}$ priors and constraints into neural-guided search, and provide sufficient conditions for enforcing constraints. We integrate several priors and constraints from existing works into this framework, propose several new ones, and demonstrate their efficacy in informing the task of symbolic regression.
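The core mechanism described above—adjusting logits in place before each sampling step—can be illustrated with a minimal sketch. This is not the authors' implementation; the function names (`apply_in_situ`, `sample`) and the toy vocabulary are hypothetical, and it assumes a soft prior is supplied as additive log-probabilities while a hard constraint is supplied as a boolean mask that sends disallowed tokens to $-\infty$:

```python
import numpy as np

def apply_in_situ(logits, log_prior=None, mask=None):
    """Combine raw logits with an in situ prior and constraint.

    log_prior: per-token additive log-probability adjustment (soft prior).
    mask: boolean array, True where a token is allowed (hard constraint);
          disallowed tokens receive -inf so they can never be sampled.
    """
    adjusted = np.asarray(logits, dtype=float).copy()
    if log_prior is not None:
        adjusted = adjusted + log_prior
    if mask is not None:
        adjusted = np.where(mask, adjusted, -np.inf)
    return adjusted

def sample(adjusted_logits, rng):
    # Softmax over adjusted logits; -inf entries get probability zero.
    z = adjusted_logits - np.max(adjusted_logits)
    p = np.exp(z)
    p /= p.sum()
    return rng.choice(len(p), p=p), p

# Toy example: 4 tokens; softly prefer token 2, forbid token 3.
rng = np.random.default_rng(0)
logits = np.array([1.0, 2.0, 0.5, 1.5])       # raw RNN outputs
log_prior = np.log(np.array([1.0, 1.0, 4.0, 1.0]))
mask = np.array([True, True, True, False])
adjusted = apply_in_situ(logits, log_prior, mask)
token, probs = sample(adjusted, rng)
```

In a full neural-guided search loop, this adjustment would be applied at every autoregressive step, with the mask recomputed from the partially built object so that constraints depend on the current search state.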

Cite

Text

Petersen et al. "Incorporating Domain Knowledge into Neural-Guided Search via in Situ Priors and Constraints." ICML 2021 Workshops: AutoML, 2021.

Markdown

[Petersen et al. "Incorporating Domain Knowledge into Neural-Guided Search via in Situ Priors and Constraints." ICML 2021 Workshops: AutoML, 2021.](https://mlanthology.org/icmlw/2021/petersen2021icmlw-incorporating/)

BibTeX

@inproceedings{petersen2021icmlw-incorporating,
  title     = {{Incorporating Domain Knowledge into Neural-Guided Search via in Situ Priors and Constraints}},
  author    = {Petersen, Brenden K and Santiago, Claudio and Landajuela, Mikel},
  booktitle = {ICML 2021 Workshops: AutoML},
  year      = {2021},
  url       = {https://mlanthology.org/icmlw/2021/petersen2021icmlw-incorporating/}
}