Controllable Text Generation with Neurally-Decomposed Oracle
Abstract
We propose a general and efficient framework to control auto-regressive generation models with NeurAlly-Decomposed Oracle (NADO). Given a pre-trained base language model and a sequence-level boolean oracle function, we aim to decompose the oracle function into token-level guidance to steer the base model in text generation. Specifically, the token-level guidance is provided by NADO, a neural model trained with examples sampled from the base model, requiring no additional auxiliary labeled data. Based on posterior regularization, we present the closed-form optimal solution to incorporate the decomposed token-level guidance into the base model for controllable generation. We further discuss how the neural approximation affects the quality of the solution. Experiments on two applications: (1) text generation with lexical constraints and (2) machine translation with formality control demonstrate that our framework efficiently guides the base model towards the given oracle while maintaining high generation quality.
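For intuition, here is a minimal single-decoding-step sketch of the idea the abstract describes: the base model's next-token distribution is reweighted by NADO's token-level guidance R^C (its estimate of the probability that the sequence-level oracle will ultimately be satisfied) and then renormalized. The function name, tensor shapes, and numerical guard below are illustrative assumptions based on the abstract, not code released with the paper.

import torch

def nado_guided_step(base_logits: torch.Tensor,
                     rc_next: torch.Tensor,
                     rc_prefix: float) -> torch.Tensor:
    # p(x_i | x_<i): next-token distribution of the pre-trained base model.
    p = torch.softmax(base_logits, dim=-1)
    # Reweight each candidate token by the decomposed token-level guidance,
    # q(x_i | x_<i) proportional to p(x_i | x_<i) * R^C(x_<=i) / R^C(x_<i),
    # then renormalize so q is a proper distribution.
    q = p * rc_next / max(rc_prefix, 1e-8)
    return q / q.sum()

Here base_logits is the base model's logits at step i, rc_next holds NADO's per-candidate estimates R^C(x_<=i) over the vocabulary, and rc_prefix is its estimate for the current prefix.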
Cite
Text
Meng et al. "Controllable Text Generation with Neurally-Decomposed Oracle." Neural Information Processing Systems, 2022.
Markdown
[Meng et al. "Controllable Text Generation with Neurally-Decomposed Oracle." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/meng2022neurips-controllable/)
BibTeX
@inproceedings{meng2022neurips-controllable,
title = {{Controllable Text Generation with Neurally-Decomposed Oracle}},
author = {Meng, Tao and Lu, Sidi and Peng, Nanyun and Chang, Kai-Wei},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/meng2022neurips-controllable/}
}