Language Generation via Combinatorial Constraint Satisfaction: A Tree Search Enhanced Monte-Carlo Approach
Abstract
Generating natural language under complex constraints is a principled formulation of controllable text generation. We present a framework that allows the specification of combinatorial constraints for sentence generation. We propose TSMH, an efficient method to generate high-likelihood sentences with respect to a pre-trained language model while satisfying the constraints. Our approach is highly flexible, requires no task-specific training, and leverages efficient constraint satisfaction solving techniques. To better handle the combinatorial constraints, a tree search algorithm is embedded into the proposal process of Markov chain Monte Carlo (MCMC) sampling to explore candidates that satisfy more constraints. Compared to existing MCMC approaches, our sampling approach has better mixing performance. Experiments show that TSMH achieves consistent and significant improvement on multiple language generation tasks.
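The core idea above, using a shallow tree search over word-level edits to propose MCMC candidates that satisfy more constraints, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the bigram table stands in for a real pre-trained language model, the single keyword constraint stands in for general combinatorial constraints, and the acceptance rule is a simplified Metropolis step that omits the proposal-probability correction. All names and parameters here are hypothetical.

```python
import math
import random

random.seed(0)

# Toy stand-in for a pre-trained language model: a hand-made bigram table.
# (Hypothetical; the paper uses a real pre-trained LM.)
BIGRAM = {
    ("<s>", "the"): 0.5, ("<s>", "a"): 0.3, ("<s>", "cats"): 0.2,
    ("the", "cat"): 0.4, ("the", "dog"): 0.4, ("the", "cats"): 0.2,
    ("a", "cat"): 0.5, ("a", "dog"): 0.5,
    ("cat", "sleeps"): 0.6, ("cat", "runs"): 0.4,
    ("dog", "runs"): 0.7, ("dog", "sleeps"): 0.3,
    ("cats", "sleep"): 0.9, ("cats", "run"): 0.1,
}
VOCAB = ["the", "a", "cat", "dog", "cats", "sleeps", "sleep", "runs", "run"]

def lm_logprob(sent):
    """Log-likelihood of a word list under the toy bigram model."""
    lp = 0.0
    for prev, w in zip(["<s>"] + sent, sent):
        lp += math.log(BIGRAM.get((prev, w), 1e-6))
    return lp

def num_satisfied(sent):
    """Toy combinatorial constraint: the sentence must contain 'cat'."""
    return int("cat" in sent)

BETA = 5.0  # weight on constraint satisfaction in the target distribution

def score(sent):
    return lm_logprob(sent) + BETA * num_satisfied(sent)

def proposals(sent):
    """One-step edits: replace, insert, or delete a word."""
    out = []
    for i in range(len(sent)):
        for w in VOCAB:
            out.append(sent[:i] + [w] + sent[i + 1:])  # replace
            out.append(sent[:i] + [w] + sent[i:])      # insert
        if len(sent) > 1:
            out.append(sent[:i] + sent[i + 1:])        # delete
    return out

def tree_search_propose(sent, depth=2, beam=5):
    """Shallow tree search over edit sequences: keep the candidates that
    satisfy the most constraints, breaking ties by LM score."""
    frontier = [sent]
    for _ in range(depth):
        cands = [c for s in frontier for c in proposals(s)]
        cands.sort(key=lambda c: (num_satisfied(c), lm_logprob(c)), reverse=True)
        frontier = cands[:beam]
    return random.choice(frontier)

def mh_step(sent):
    """Simplified Metropolis accept/reject on the searched proposal."""
    cand = tree_search_propose(sent)
    if math.log(random.random()) < score(cand) - score(sent):
        return cand
    return sent

sent = ["the", "dog", "runs"]
for _ in range(20):
    sent = mh_step(sent)
print(" ".join(sent), "| constraints satisfied:", num_satisfied(sent))
```

Because the tree search ranks candidates by constraints satisfied before the Metropolis test, the chain moves into the constraint-satisfying region far faster than a plain word-by-word MCMC proposal would, which is the mixing advantage the abstract claims.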
Cite
Text
Zhang et al. "Language Generation via Combinatorial Constraint Satisfaction: A Tree Search Enhanced Monte-Carlo Approach." NeurIPS 2020 Workshops: LMCA, 2020.
Markdown
[Zhang et al. "Language Generation via Combinatorial Constraint Satisfaction: A Tree Search Enhanced Monte-Carlo Approach." NeurIPS 2020 Workshops: LMCA, 2020.](https://mlanthology.org/neuripsw/2020/zhang2020neuripsw-language/)
BibTeX
@inproceedings{zhang2020neuripsw-language,
title = {{Language Generation via Combinatorial Constraint Satisfaction: A Tree Search Enhanced Monte-Carlo Approach}},
author = {Zhang, Maosen and Jiang, Nan and Li, Lei and Xue, Yexiang},
booktitle = {NeurIPS 2020 Workshops: LMCA},
year = {2020},
url = {https://mlanthology.org/neuripsw/2020/zhang2020neuripsw-language/}
}