Graph-Based Transformer with Cross-Candidate Verification for Semantic Parsing

Abstract

In this paper, we present a graph-based Transformer for semantic parsing. We separate the semantic parsing task into two steps: 1) use a sequence-to-sequence model to generate the logical form candidates; 2) design a graph-based Transformer to rerank the candidates. To handle the structure of logical forms, we incorporate graph information into the Transformer, and we design a cross-candidate verification mechanism that considers all the candidates jointly during ranking. Furthermore, we integrate BERT into our model and jointly train the graph-based Transformer and BERT. We conduct experiments on three semantic parsing benchmarks: ATIS, JOBS, and the Task Oriented semantic Parsing (TOP) dataset. Experiments show that our graph-based reranking model achieves results comparable to state-of-the-art models on the ATIS and JOBS datasets, and on the TOP dataset it achieves a new state-of-the-art result.
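The two-step pipeline in the abstract (generate candidates with a sequence-to-sequence model, then rerank them while looking across the whole candidate set) can be sketched as follows. This is a minimal toy illustration, not the paper's models: the candidate list, the generation scores, and the agreement-based `cross_candidate_score` are all hypothetical stand-ins for the actual seq2seq decoder and graph-based Transformer reranker.

```python
def generate_candidates(utterance):
    # Stand-in for the sequence-to-sequence generator: in the paper this
    # would be beam-search output (logical form, generation score) pairs.
    # These example logical forms and scores are invented for illustration.
    return [
        ("(lambda x (flight x))", -1.2),
        ("(lambda x (airline x))", -1.5),
        ("(lambda x (flight (cheapest x)))", -1.6),
    ]

def tokens_of(logical_form):
    # Crude tokenization of a logical form into a set of symbols.
    return set(logical_form.replace("(", " ").replace(")", " ").split())

def cross_candidate_score(candidate, all_candidates):
    # Toy "cross-candidate verification": score each candidate in the
    # context of the whole beam by rewarding symbols that other candidates
    # agree on, rather than scoring each candidate in isolation.
    toks = tokens_of(candidate)
    support = 0.0
    for other, _ in all_candidates:
        if other == candidate:
            continue
        support += len(toks & tokens_of(other)) / max(len(toks), 1)
    return support / max(len(all_candidates) - 1, 1)

def rerank(utterance, alpha=0.5):
    # Combine the generation score with the cross-candidate score and
    # re-sort; alpha is a made-up interpolation weight.
    candidates = generate_candidates(utterance)
    rescored = [
        (lf, score + alpha * cross_candidate_score(lf, candidates))
        for lf, score in candidates
    ]
    return sorted(rescored, key=lambda pair: pair[1], reverse=True)

best, best_score = rerank("show me the cheapest flight")[0]
print(best)
```

In the paper itself the reranker is a graph-based Transformer over the logical-form structure; the agreement heuristic above only illustrates the idea that every candidate is scored with reference to all the others.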

Cite

Text

Shao et al. "Graph-Based Transformer with Cross-Candidate Verification for Semantic Parsing." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I05.6408

Markdown

[Shao et al. "Graph-Based Transformer with Cross-Candidate Verification for Semantic Parsing." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/shao2020aaai-graph/) doi:10.1609/AAAI.V34I05.6408

BibTeX

@inproceedings{shao2020aaai-graph,
  title     = {{Graph-Based Transformer with Cross-Candidate Verification for Semantic Parsing}},
  author    = {Shao, Bo and Gong, Yeyun and Qi, Weizhen and Cao, Guihong and Ji, Jianshu and Lin, Xiaola},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {8807--8814},
  doi       = {10.1609/AAAI.V34I05.6408},
  url       = {https://mlanthology.org/aaai/2020/shao2020aaai-graph/}
}