Global Encoding for Abstractive Summarization

Junyang Lin, Xu Sun, Shuming Ma, Qi Su


Abstract
In neural abstractive summarization, the conventional sequence-to-sequence (seq2seq) model often suffers from repetition and semantic irrelevance. To tackle the problem, we propose a global encoding framework, which controls the information flow from the encoder to the decoder based on the global information of the source context. It consists of a convolutional gated unit that performs global encoding to improve the representations of the source-side information. Evaluations on the LCSTS and the English Gigaword both demonstrate that our model outperforms the baseline models, and the analysis shows that our model is capable of generating summaries of higher quality and reducing repetition.
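The abstract names the convolutional gated unit but does not spell out its mechanics. As a rough illustration of the idea it describes (a 1-D convolution capturing local context, self-attention injecting global context, and a sigmoid gate filtering the encoder outputs), here is a minimal PyTorch sketch. The module and parameter names are illustrative assumptions, not the authors' code; the reference implementation is linked under Code below.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvGatedUnit(nn.Module):
    """Sketch of a convolutional gated unit: a same-padded 1-D
    convolution over the encoder states models local n-gram context,
    scaled dot-product self-attention mixes in global context, and a
    sigmoid gate filters the original encoder outputs elementwise."""

    def __init__(self, hidden_size: int, kernel_size: int = 3):
        super().__init__()
        # Same padding keeps the sequence length unchanged.
        self.conv = nn.Conv1d(hidden_size, hidden_size, kernel_size,
                              padding=kernel_size // 2)
        self.scale = hidden_size ** 0.5

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_size) encoder outputs.
        g = F.relu(self.conv(h.transpose(1, 2))).transpose(1, 2)
        # Self-attention over the convolved states (global information).
        attn = torch.softmax(g @ g.transpose(1, 2) / self.scale, dim=-1)
        g = attn @ g
        # Gate the encoder outputs with the globally informed context.
        return h * torch.sigmoid(g)

# Usage: gate encoder outputs before handing them to the decoder.
encoder_out = torch.randn(2, 10, 512)    # (batch, seq_len, hidden)
gated = ConvGatedUnit(512)(encoder_out)  # same shape as encoder_out
```

In the paper the gate sits on top of a bidirectional LSTM encoder, and the convolutional component is an inception-like stack of kernels rather than a single convolution; the sketch keeps one kernel size for brevity.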
Anthology ID:
P18-2027
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
163–169
URL:
https://aclanthology.org/P18-2027
DOI:
10.18653/v1/P18-2027
Cite (ACL):
Junyang Lin, Xu Sun, Shuming Ma, and Qi Su. 2018. Global Encoding for Abstractive Summarization. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 163–169, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Global Encoding for Abstractive Summarization (Lin et al., ACL 2018)
PDF:
https://aclanthology.org/P18-2027.pdf
Poster:
P18-2027.Poster.pdf
Code
lancopku/Global-Encoding
Data
LCSTS