Structured Neural Summarization
Abstract
Summarization of long sequences into a concise statement is a core problem in natural language processing, requiring non-trivial understanding of the input. Based on the promising results of graph neural networks on highly structured data, we develop a framework to extend existing sequence encoders with a graph component that can reason about long-distance relationships in weakly structured data such as text. In an extensive evaluation, we show that the resulting hybrid sequence-graph models outperform both pure sequence models and pure graph models on a range of summarization tasks.
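The hybrid encoder described in the abstract can be sketched as a sequence encoder whose per-token states are then refined by graph message passing over long-distance edges. The sketch below is a minimal illustration of that composition, not the authors' exact model: it uses a plain recurrent cell in place of their sequence encoder and a single untyped message matrix in place of a full GGNN with typed edges.

```python
import numpy as np

def seq_encode(embeddings, W, U):
    # Simple RNN over the token sequence (a stand-in for the
    # paper's sequence encoder; hypothetical simplification).
    h = np.zeros(W.shape[0])
    states = []
    for x in embeddings:
        h = np.tanh(W @ x + U @ h)
        states.append(h)
    return np.stack(states)

def graph_layer(states, edges, M):
    # One round of message passing: each node aggregates the
    # transformed states of its in-neighbours along graph edges.
    msgs = np.zeros_like(states)
    for src, dst in edges:
        msgs[dst] += M @ states[src]
    # Residual update keeps the sequence information in the state.
    return np.tanh(states + msgs)

def hybrid_encode(embeddings, edges, W, U, M, rounds=3):
    # Sequence encoding first, then graph refinement on top.
    states = seq_encode(embeddings, W, U)
    for _ in range(rounds):
        states = graph_layer(states, edges, M)
    return states
```

The edges here could connect, e.g., repeated mentions of the same entity across a long document, letting distant tokens exchange information in a few propagation rounds rather than through many recurrent steps.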
Cite

Text:
Fernandes et al. "Structured Neural Summarization." International Conference on Learning Representations, 2019.

Markdown:
[Fernandes et al. "Structured Neural Summarization." International Conference on Learning Representations, 2019.](https://mlanthology.org/iclr/2019/fernandes2019iclr-structured/)

BibTeX:
@inproceedings{fernandes2019iclr-structured,
  title = {{Structured Neural Summarization}},
  author = {Fernandes, Patrick and Allamanis, Miltiadis and Brockschmidt, Marc},
  booktitle = {International Conference on Learning Representations},
  year = {2019},
  url = {https://mlanthology.org/iclr/2019/fernandes2019iclr-structured/}
}