Document Summarization with VHTM: Variational Hierarchical Topic-Aware Mechanism

Abstract

Automatic text summarization focuses on distilling summary information from texts. This research field has been explored considerably over the past decades because of its significant role in many natural language processing tasks; however, two challenging issues block its further development: (1) how to build a summarization model that embeds topic inference rather than relying on a pre-trained topic model and (2) how to merge the latent topics into diverse granularity levels. In this study, we propose a variational hierarchical model, dubbed VHTM, to address both issues holistically. Unlike previous work assisted by a pre-trained, single-grained topic model, VHTM is the first attempt to jointly accomplish summarization and topic inference via a variational encoder-decoder, and to merge topics at multiple granularity levels through topic embedding and attention. Comprehensive experiments validate the superior performance of VHTM compared with the baselines, along with semantically consistent topics.
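The abstract mentions merging latent topics into the decoder through topic embedding and attention. As an illustrative sketch only (the function and weight names below are assumptions, not VHTM's actual formulation), additive attention over encoder states can be augmented with a topic embedding so that topic information biases the attention scores:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def topic_aware_attention(H, q, t, W_h, W_q, W_t, v):
    """Additive attention augmented with a topic embedding.

    H   : (n, d) encoder hidden states
    q   : (d,)   decoder query state
    t   : (d,)   latent topic embedding (hypothetical placement)
    W_h, W_q, W_t : (k, d) projection matrices; v : (k,) score vector
    Returns the context vector and attention weights.
    """
    # Topic embedding enters the score alongside the query and each state.
    scores = np.array([v @ np.tanh(W_h @ h + W_q @ q + W_t @ t) for h in H])
    alpha = softmax(scores)          # attention weights sum to 1
    context = alpha @ H              # topic-aware weighted sum of states
    return context, alpha

# Toy usage with random parameters
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))
q, t = rng.standard_normal(4), rng.standard_normal(4)
W_h, W_q, W_t = (rng.standard_normal((3, 4)) for _ in range(3))
v = rng.standard_normal(3)
context, alpha = topic_aware_attention(H, q, t, W_h, W_q, W_t, v)
```

Here the same score function could be reused at different granularity levels (word, sentence) with different topic embeddings, which is one plausible reading of the "multi-grained" merging the abstract describes.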

Cite

Text

Fu et al. "Document Summarization with VHTM: Variational Hierarchical Topic-Aware Mechanism." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I05.6277

Markdown

[Fu et al. "Document Summarization with VHTM: Variational Hierarchical Topic-Aware Mechanism." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/fu2020aaai-document/) doi:10.1609/AAAI.V34I05.6277

BibTeX

@inproceedings{fu2020aaai-document,
  title     = {{Document Summarization with VHTM: Variational Hierarchical Topic-Aware Mechanism}},
  author    = {Fu, Xiyan and Wang, Jun and Zhang, Jinghan and Wei, Jinmao and Yang, Zhenglu},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {7740--7747},
  doi       = {10.1609/AAAI.V34I05.6277},
  url       = {https://mlanthology.org/aaai/2020/fu2020aaai-document/}
}