Effective Slot Filling via Weakly-Supervised Dual-Model Learning

Abstract

Slot filling is a challenging task in Spoken Language Understanding (SLU). Supervised methods usually require large amounts of annotation to maintain desirable performance. One way to relieve this heavy dependency on labeled data is bootstrapping, which leverages unlabeled data. However, bootstrapping is known to suffer from semantic drift. We argue that semantic drift can be tackled by exploiting the correlation between slot values (phrases) and their respective types. Using a particular kind of weakly labeled data, namely plain phrases contained in sentences, we propose a weakly-supervised slot filling approach. Our approach trains two models, a classifier and a tagger, which can effectively learn from each other on the weakly labeled data. The experimental results demonstrate that our approach achieves better results than standard baselines on multiple datasets, especially in the low-resource setting.
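The dual-model idea from the abstract can be illustrated with a minimal toy sketch. This is not the authors' implementation: the real models are neural, and here the classifier is a word-vote lookup and the tagger a BIO lexicon matcher; all class and function names are hypothetical. The sketch only shows the exchange: the classifier assigns a pseudo-type to a weakly labeled phrase (a phrase marked in a sentence but without a slot type), and the tagger trains on that pseudo-label.

```python
# Toy sketch of dual-model learning for slot filling (illustrative only;
# PhraseClassifier, SimpleTagger, and dual_train are hypothetical names).
from collections import Counter


class PhraseClassifier:
    """Predicts a slot type for a phrase via word-level type votes."""

    def __init__(self):
        self.votes = {}  # word -> Counter of slot types

    def fit(self, phrase, slot_type):
        for word in phrase.split():
            self.votes.setdefault(word, Counter())[slot_type] += 1

    def predict(self, phrase):
        total = Counter()
        for word in phrase.split():
            total += self.votes.get(word, Counter())
        return total.most_common(1)[0][0] if total else None


class SimpleTagger:
    """Tags token sequences with BIO labels via longest-match phrase lookup."""

    def __init__(self):
        self.lexicon = {}  # phrase -> slot type

    def fit(self, phrase, slot_type):
        self.lexicon[phrase] = slot_type

    def tag(self, tokens):
        tags = ["O"] * len(tokens)
        i = 0
        while i < len(tokens):
            for j in range(len(tokens), i, -1):  # longest match first
                phrase = " ".join(tokens[i:j])
                if phrase in self.lexicon:
                    t = self.lexicon[phrase]
                    tags[i] = f"B-{t}"
                    for k in range(i + 1, j):
                        tags[k] = f"I-{t}"
                    i = j - 1
                    break
            i += 1
        return tags


def dual_train(labeled, weak_phrases, classifier, tagger):
    """labeled: [(phrase, slot_type)]; weak_phrases: phrases with no type.
    One round shown: the classifier pseudo-labels the weak phrases and the
    tagger learns from them (the paper's loop is iterative and symmetric)."""
    for phrase, slot_type in labeled:
        classifier.fit(phrase, slot_type)
        tagger.fit(phrase, slot_type)
    for phrase in weak_phrases:
        pseudo_type = classifier.predict(phrase)
        if pseudo_type is not None:
            tagger.fit(phrase, pseudo_type)
```

For example, after `dual_train([("new york", "city")], ["new jersey"], clf, tgr)`, the tagger can tag `"fly to new jersey"` with `B-city I-city` on the last two tokens, even though "new jersey" was never typed by hand, which is the correlation between slot values and types that the abstract appeals to.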

Cite

Text

Wang et al. "Effective Slot Filling via Weakly-Supervised Dual-Model Learning." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I16.17643

Markdown

[Wang et al. "Effective Slot Filling via Weakly-Supervised Dual-Model Learning." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/wang2021aaai-effective/) doi:10.1609/AAAI.V35I16.17643

BibTeX

@inproceedings{wang2021aaai-effective,
  title     = {{Effective Slot Filling via Weakly-Supervised Dual-Model Learning}},
  author    = {Wang, Jue and Chen, Ke and Shou, Lidan and Wu, Sai and Chen, Gang},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {13952--13960},
  doi       = {10.1609/AAAI.V35I16.17643},
  url       = {https://mlanthology.org/aaai/2021/wang2021aaai-effective/}
}