Global-Aware Beam Search for Neural Abstractive Summarization
Abstract
This study develops a calibrated beam-based algorithm with awareness of the global attention distribution for neural abstractive summarization, aiming to address the local optimality problem of the original beam search in a rigorous way. Specifically, a novel global protocol is proposed based on the attention distribution to stipulate how a globally optimal hypothesis should attend to the source. A global scoring mechanism is then developed to regulate beam search so that summaries are generated in a near-globally optimal fashion. This design enjoys a distinctive property: the global attention distribution can be predicted before inference, enabling step-wise improvements on beam search through the global scoring mechanism. Extensive experiments on nine datasets show that the global (attention)-aware inference significantly improves state-of-the-art summarization models, even with empirical hyper-parameters. The algorithm also proves robust, as it continues to generate meaningful texts under corrupted attention distributions. The code and a comprehensive set of examples are available.
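To make the core idea concrete, below is a minimal, illustrative sketch of how a beam-search step could be rescored with a global-attention penalty, assuming the global attention distribution over the source has already been predicted before inference. The function names (global_aware_score, rescore_beam), the L2-distance penalty, and the lambda_weight hyper-parameter are assumptions for illustration only and are not the paper's exact scoring formulation.

import numpy as np

def global_aware_score(log_prob, accumulated_attn, predicted_global_attn, lambda_weight=1.0):
    """Combine the usual log-probability score with a penalty measuring how far the
    hypothesis' accumulated attention over the source deviates from the pre-predicted
    global attention distribution (here an L2 distance, chosen for simplicity)."""
    acc = accumulated_attn / max(accumulated_attn.sum(), 1e-9)
    tgt = predicted_global_attn / max(predicted_global_attn.sum(), 1e-9)
    penalty = np.linalg.norm(acc - tgt)
    return log_prob - lambda_weight * penalty

def rescore_beam(candidates, predicted_global_attn, lambda_weight=1.0, beam_size=4):
    """candidates: list of (log_prob, accumulated_attn) pairs for expanded hypotheses.
    Returns the top `beam_size` candidates under the global-aware score."""
    scored = [
        (global_aware_score(lp, attn, predicted_global_attn, lambda_weight), lp, attn)
        for lp, attn in candidates
    ]
    scored.sort(key=lambda x: x[0], reverse=True)
    return scored[:beam_size]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src_len = 10
    predicted = rng.dirichlet(np.ones(src_len))            # predicted global attention
    cands = [(-float(rng.uniform(1, 5)), rng.dirichlet(np.ones(src_len)))
             for _ in range(8)]                            # toy expanded hypotheses
    for score, lp, _ in rescore_beam(cands, predicted, lambda_weight=2.0):
        print(f"global-aware score={score:.3f}  log-prob={lp:.3f}")

In this sketch, hypotheses whose accumulated attention deviates from the predicted global distribution are demoted at each step, which is the step-wise regulation the abstract describes; the paper's actual protocol and scoring function should be consulted for the precise form of the penalty.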
Cite
Text
Ma et al. "Global-Aware Beam Search for Neural Abstractive Summarization." Neural Information Processing Systems, 2021.
Markdown
[Ma et al. "Global-Aware Beam Search for Neural Abstractive Summarization." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/ma2021neurips-globalaware/)
BibTeX
@inproceedings{ma2021neurips-globalaware,
  title     = {{Global-Aware Beam Search for Neural Abstractive Summarization}},
  author    = {Ma, Ye and Lan, Zixun and Zong, Lu and Huang, Kaizhu},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/ma2021neurips-globalaware/}
}