Improving Neural Abstractive Document Summarization with Explicit Information Selection Modeling

Wei Li, Xinyan Xiao, Yajuan Lyu, Yuanzhuo Wang


Abstract
Information selection is the most important component in the document summarization task. In this paper, we propose to extend the basic neural encoder-decoder framework with an information selection layer to explicitly model and optimize the information selection process in abstractive document summarization. Specifically, our information selection layer consists of two parts: gated global information filtering and local sentence selection. Unnecessary information in the original document is first filtered globally; salient sentences are then selected locally while each summary sentence is generated in turn. To optimize the information selection process directly, we also introduce distantly-supervised training guided by the gold summary. Experimental results demonstrate that explicitly modeling and optimizing the information selection process significantly improves document summarization performance, enabling our model to generate more informative and concise summaries and to outperform state-of-the-art neural abstractive methods.
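The gated global information filtering the abstract describes can be illustrated with a minimal sketch: a sigmoid gate, computed from a sentence representation and a global document representation, scales the sentence vector so unimportant content is attenuated before decoding. The function name `gate_filter` and the toy weights below are hypothetical, not the authors' exact formulation.

```python
import math

def sigmoid(x):
    """Standard logistic function, squashing any real score into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def gate_filter(sent_vec, doc_vec, w_s, w_d, b):
    """Illustrative scalar gate for global information filtering.

    A gate value g in (0, 1) is computed from the sentence vector and a
    global document vector, then used to rescale the sentence
    representation; sentences with low gate values contribute little to
    downstream summary generation. (The paper's gate may instead be a
    vector acting element-wise; a scalar keeps the sketch minimal.)
    """
    score = sum(ws * s for ws, s in zip(w_s, sent_vec)) \
          + sum(wd * d for wd, d in zip(w_d, doc_vec)) + b
    g = sigmoid(score)
    return [g * s for s in sent_vec], g

# Toy usage: filter one sentence representation against a document vector.
sent = [0.5, -1.0, 2.0]
doc = [1.0, 0.0, -0.5]
filtered, g = gate_filter(sent, doc,
                          w_s=[0.1, 0.2, 0.3],
                          w_d=[0.4, 0.1, 0.2],
                          b=0.0)
```

Local sentence selection would then operate over these filtered representations, choosing salient sentences as each summary sentence is decoded.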
Anthology ID:
D18-1205
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1787–1796
URL:
https://aclanthology.org/D18-1205
DOI:
10.18653/v1/D18-1205
Cite (ACL):
Wei Li, Xinyan Xiao, Yajuan Lyu, and Yuanzhuo Wang. 2018. Improving Neural Abstractive Document Summarization with Explicit Information Selection Modeling. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 1787–1796, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Improving Neural Abstractive Document Summarization with Explicit Information Selection Modeling (Li et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1205.pdf
Attachment:
 D18-1205.Attachment.pdf
Video:
 https://vimeo.com/305885506
Data
CNN/Daily Mail