Guiding Generation for Abstractive Text Summarization Based on Key Information Guide Network

Chenliang Li, Weiran Xu, Si Li, Sheng Gao


Abstract
Neural network models based on the attentional encoder-decoder architecture have shown strong performance on abstractive text summarization. However, these models are difficult to control during generation, which can lead to summaries that miss key information. We propose a guided generation model that combines the extractive and abstractive methods. First, we obtain keywords from the text with an extractive model. Then, we introduce a Key Information Guide Network (KIGN), which encodes the keywords into a key information representation, to guide the generation process. In addition, we use a prediction-guide mechanism, which estimates the long-term value of future decoding, to further guide summary generation. We evaluate our model on the CNN/Daily Mail dataset, and the experimental results show that it yields significant improvements.
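The abstract describes keyword-guided attention: extracted keywords are encoded into a key information representation that conditions the decoder's attention over the source. Below is a minimal PyTorch sketch of that idea; the module names, layer sizes, and the exact way the key vector enters the attention score are assumptions made for illustration, not the authors' released implementation.

```python
# Minimal sketch (assumptions: layer sizes, scoring form, and how the key vector
# is combined with the decoder state are illustrative, not the paper's exact model).
import torch
import torch.nn as nn


class KeyInfoGuideAttention(nn.Module):
    """Attention over encoder states, conditioned on the decoder state and a
    fixed "key information" vector built from extracted keywords."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # BiLSTM that encodes the keyword sequence into the key information vector.
        self.key_encoder = nn.LSTM(hidden_size, hidden_size // 2,
                                   batch_first=True, bidirectional=True)
        # Scoring layers: encoder state + decoder state + key vector -> scalar score.
        self.W_h = nn.Linear(hidden_size, hidden_size, bias=False)
        self.W_s = nn.Linear(hidden_size, hidden_size, bias=False)
        self.W_k = nn.Linear(hidden_size, hidden_size, bias=False)
        self.v = nn.Linear(hidden_size, 1, bias=False)

    def encode_keywords(self, keyword_embs: torch.Tensor) -> torch.Tensor:
        # keyword_embs: (batch, num_keywords, hidden) -> key vector: (batch, hidden)
        outputs, _ = self.key_encoder(keyword_embs)
        # Use the bidirectional output at the last keyword position as the
        # key information vector (a simplification for this sketch).
        return outputs[:, -1, :]

    def forward(self, enc_states, dec_state, key_vector):
        # enc_states: (batch, src_len, hidden); dec_state, key_vector: (batch, hidden)
        scores = self.v(torch.tanh(
            self.W_h(enc_states)
            + self.W_s(dec_state).unsqueeze(1)
            + self.W_k(key_vector).unsqueeze(1)
        )).squeeze(-1)                               # (batch, src_len)
        attn = torch.softmax(scores, dim=-1)         # attention weights
        context = torch.bmm(attn.unsqueeze(1), enc_states).squeeze(1)
        return context, attn
```

The context vector produced this way would feed the decoder as in a standard attentional model; the prediction-guide mechanism mentioned in the abstract (estimating long-term value during decoding) is a separate component not sketched here.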
Anthology ID:
N18-2009
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
55–60
URL:
https://aclanthology.org/N18-2009
DOI:
10.18653/v1/N18-2009
Cite (ACL):
Chenliang Li, Weiran Xu, Si Li, and Sheng Gao. 2018. Guiding Generation for Abstractive Text Summarization Based on Key Information Guide Network. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 55–60, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Guiding Generation for Abstractive Text Summarization Based on Key Information Guide Network (Li et al., NAACL 2018)
PDF:
https://aclanthology.org/N18-2009.pdf
Data:
CNN/Daily Mail