Infobox-to-Text Generation with Tree-like Planning Based Attention Network

Abstract

We study the problem of infobox-to-text generation, which aims to generate a textual description from a key-value table. Previous neural methods represent the input infobox as a sequence and use end-to-end models without order-planning, which suffer from incoherence and poor adaptability to disordered input. Recent planning-based models apply only static order-planning to guide generation, which may cause error propagation between planning and generation. To address these issues, we propose a Tree-like PLanning based Attention Network (Tree-PLAN) that leverages both static order-planning and dynamic tuning to guide generation. A novel tree-like tuning encoder dynamically tunes the static order-plan by merging the most relevant attributes together layer by layer. Experiments on two datasets show that our model outperforms previous methods in both automatic and human evaluation and demonstrate that it adapts better to disordered input.
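To make the layer-by-layer merging idea concrete, below is a minimal sketch (not the authors' implementation) of a tree-like tuning encoder: starting from attribute vectors ordered by a static plan, each layer merges the most relevant adjacent pair into a parent node until a single root remains. The module name, hidden size, composition function, and the cosine-similarity relevance score are all assumptions for illustration.

```python
# Minimal sketch of a tree-like tuning encoder (assumptions, not the paper's code):
# attribute vectors arrive in static-plan order; each step merges the most
# relevant adjacent pair, building a tree bottom-up.

import torch
import torch.nn as nn
import torch.nn.functional as F


class TreeTuningEncoder(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        # Composition function that merges two attribute vectors into one parent vector.
        self.compose = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, attr_vecs: torch.Tensor) -> list:
        """attr_vecs: (num_attrs, hidden) attribute vectors in static-plan order.
        Returns the internal-node vectors produced at each merge step."""
        nodes = [v for v in attr_vecs]          # leaves, in planned order
        produced = []
        while len(nodes) > 1:
            # Score adjacent pairs by cosine similarity (assumed relevance measure).
            scores = torch.stack([
                F.cosine_similarity(nodes[i], nodes[i + 1], dim=0)
                for i in range(len(nodes) - 1)
            ])
            i = int(torch.argmax(scores))       # most relevant adjacent pair
            merged = torch.tanh(self.compose(torch.cat([nodes[i], nodes[i + 1]])))
            nodes[i:i + 2] = [merged]           # replace the pair with its parent
            produced.append(merged)
        return produced                          # internal nodes, bottom-up


if __name__ == "__main__":
    enc = TreeTuningEncoder(hidden_size=8)
    attrs = torch.randn(5, 8)                    # e.g., 5 infobox attributes
    tuned = enc(attrs)
    print(len(tuned), tuned[0].shape)            # 4 merges for 5 attributes
```

In the paper, such tuned representations would then condition the attention-based decoder alongside the static plan; the sketch only illustrates the merging schedule, not the full generation pipeline.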

Cite

Text

Bai et al. "Infobox-to-Text Generation with Tree-like Planning Based Attention Network." International Joint Conference on Artificial Intelligence, 2020. doi:10.24963/IJCAI.2020/522

Markdown

[Bai et al. "Infobox-to-Text Generation with Tree-like Planning Based Attention Network." International Joint Conference on Artificial Intelligence, 2020.](https://mlanthology.org/ijcai/2020/bai2020ijcai-infobox/) doi:10.24963/IJCAI.2020/522

BibTeX

@inproceedings{bai2020ijcai-infobox,
  title     = {{Infobox-to-Text Generation with Tree-like Planning Based Attention Network}},
  author    = {Bai, Yang and Li, Ziran and Ding, Ning and Shen, Ying and Zheng, Hai-Tao},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {3773--3779},
  doi       = {10.24963/IJCAI.2020/522},
  url       = {https://mlanthology.org/ijcai/2020/bai2020ijcai-infobox/}
}