Zero-Shot Slot Filling with Slot-Prefix Prompting and Attention Relationship Descriptor

Abstract

This paper addresses zero-shot slot filling, which aims to build a system that generalizes to unseen slot types without any training data. The key to zero-shot slot filling is to match tokens from the utterance against the semantic definition of the slot without training data in the target domain. This paper tackles the problem by devising a scheme that fully leverages pre-trained language models (PLMs). To this end, we propose a new prompting scheme that utilizes both learnable tokens and slot names to guide the model to focus on the text spans relevant to a given slot. Furthermore, we use attention values between tokens to form a feature descriptor for each token, motivated by the fact that attention values in a PLM naturally characterize various relationships between tokens, e.g., syntactic or semantic ones. By further consolidating those features with an additional transformer-based aggregation module, we obtain a simple but effective zero-shot slot filling system that significantly outperforms previous methods, as demonstrated by our experimental studies.
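The attention relationship descriptor idea can be sketched roughly as follows. This is a minimal illustration, not the paper's exact construction: the tensor shapes, the row/column concatenation, and the use of random stand-in attention maps are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

# Hypothetical PLM dimensions: L layers, H heads, sequence length T.
L, H, T = 12, 12, 8
rng = np.random.default_rng(0)

# Simulated attention maps, one (T x T) matrix per layer and head.
# In practice these would be the attention outputs of a real PLM.
attn = rng.random((L, H, T, T))
attn = attn / attn.sum(axis=-1, keepdims=True)  # row-normalize, as softmax would

def attention_descriptor(attn, i):
    """Build a feature descriptor for token i by concatenating, across all
    layers and heads, the attention it pays to every token (row i) and the
    attention it receives from every token (column i)."""
    outgoing = attn[:, :, i, :]   # (L, H, T): token i attending to each token
    incoming = attn[:, :, :, i]   # (L, H, T): each token attending to token i
    return np.concatenate([outgoing.ravel(), incoming.ravel()])

desc = attention_descriptor(attn, 3)
print(desc.shape)  # one descriptor of length 2 * L * H * T per token
```

Per-token descriptors like this one would then be fed to an aggregation module (a small transformer, in the paper's design) rather than classified directly.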

Cite

Text

Luo and Liu. "Zero-Shot Slot Filling with Slot-Prefix Prompting and Attention Relationship Descriptor." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I11.26566

Markdown

[Luo and Liu. "Zero-Shot Slot Filling with Slot-Prefix Prompting and Attention Relationship Descriptor." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/luo2023aaai-zero/) doi:10.1609/AAAI.V37I11.26566

BibTeX

@inproceedings{luo2023aaai-zero,
  title     = {{Zero-Shot Slot Filling with Slot-Prefix Prompting and Attention Relationship Descriptor}},
  author    = {Luo, Qiaoyang and Liu, Lingqiao},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {13344--13352},
  doi       = {10.1609/AAAI.V37I11.26566},
  url       = {https://mlanthology.org/aaai/2023/luo2023aaai-zero/}
}