Switch-GPT: An Effective Method for Constrained Text Generation Under Few-Shot Settings (Student Abstract)

Abstract

In real-world applications of natural language generation, target sentences are often required to satisfy lexical constraints. However, the success of most neural-based models relies heavily on data, which is infeasible in data-scarce new domains. In this work, we present FewShotAmazon, the first benchmark for the task of constrained text generation under few-shot settings on multiple domains. Further, we propose the Switch-GPT model, which utilizes the strong language modeling capacity of GPT-2 to generate fluent and well-formulated sentences, while using a light attention module to decide which constraint to attend to at each step. Experiments show that the proposed Switch-GPT model is effective and remarkably outperforms the baselines. Code will be available at https://github.com/chang-github-00/Switch-GPT.

Cite

Text

Ma et al. "Switch-GPT: An Effective Method for Constrained Text Generation Under Few-Shot Settings (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/aaai.v36i11.21642

Markdown

[Ma et al. "Switch-GPT: An Effective Method for Constrained Text Generation Under Few-Shot Settings (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/ma2022aaai-switch/) doi:10.1609/aaai.v36i11.21642

BibTeX

@inproceedings{ma2022aaai-switch,
  title     = {{Switch-GPT: An Effective Method for Constrained Text Generation Under Few-Shot Settings (Student Abstract)}},
  author    = {Ma, Chang and Zhang, Song and Shen, Gehui and Deng, Zhi-Hong},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2022},
  pages     = {13011--13012},
  doi       = {10.1609/aaai.v36i11.21642},
  url       = {https://mlanthology.org/aaai/2022/ma2022aaai-switch/}
}