Enriching and Controlling Global Semantics for Text Summarization

Thong Nguyen, Anh Tuan Luu, Truc Lu, Tho Quan


Abstract
Recently, Transformer-based models have been proven effective in the abstractive summarization task by creating fluent and informative summaries. Nevertheless, these models still suffer from the short-range dependency problem, causing them to produce summaries that miss the key points of the document. In this paper, we attempt to address this issue by introducing a neural topic model empowered with normalizing flow to capture the global semantics of the document, which are then integrated into the summarization model. In addition, to avoid the overwhelming effect of global semantics on the contextualized representation, we introduce a mechanism to control the amount of global semantics supplied to the text generation module. Our method outperforms state-of-the-art summarization models on five common text summarization datasets, namely CNN/DailyMail, XSum, Reddit TIFU, arXiv, and PubMed.
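The controlling mechanism the abstract describes can be illustrated with a gated fusion: a learned sigmoid gate decides, per token and per dimension, how much of the document-level topic vector to mix into the contextualized states. The sketch below is purely illustrative; the function and parameter names (`gated_fusion`, `W`, `b`) are hypothetical and do not reflect the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(h, z, W, b):
    """Blend a global topic vector z into token states h via a learned gate.

    h: (seq_len, d) contextualized token representations
    z: (d,)         document-level topic (global semantics) vector
    W: (2*d, d), b: (d,) hypothetical gate parameters
    """
    z_tiled = np.tile(z, (h.shape[0], 1))  # broadcast the topic vector to every token
    # Gate in (0, 1) computed from both the local state and the global signal
    gate = sigmoid(np.concatenate([h, z_tiled], axis=-1) @ W + b)
    # The gate scales how much global semantics reaches the generation module
    return h + gate * z_tiled

d, seq_len = 8, 5
h = rng.normal(size=(seq_len, d))
z = rng.normal(size=(d,))
W = rng.normal(size=(2 * d, d)) * 0.1
b = np.zeros(d)
fused = gated_fusion(h, z, W, b)
```

With the gate near zero the output reduces to the original contextual states, so the topic signal can be attenuated when it would overwhelm the local representation.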
Anthology ID:
2021.emnlp-main.744
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9443–9456
URL:
https://aclanthology.org/2021.emnlp-main.744
DOI:
10.18653/v1/2021.emnlp-main.744
Cite (ACL):
Thong Nguyen, Anh Tuan Luu, Truc Lu, and Tho Quan. 2021. Enriching and Controlling Global Semantics for Text Summarization. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9443–9456, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Enriching and Controlling Global Semantics for Text Summarization (Nguyen et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.744.pdf
Video:
https://aclanthology.org/2021.emnlp-main.744.mp4
Data
Reddit TIFU