Sparsity and Sentence Structure in Encoder-Decoder Attention of Summarization Systems

Potsawee Manakul, Mark Gales


Abstract
Transformer models have achieved state-of-the-art results in a wide range of NLP tasks including summarization. Training and inference using large transformer models can be computationally expensive. Previous work has focused on one important bottleneck, the quadratic self-attention mechanism in the encoder. Modified encoder architectures such as LED or LoBART use local attention patterns to address this problem for summarization. In contrast, this work focuses on the transformer’s encoder-decoder attention mechanism. The cost of this attention becomes more significant in inference or training approaches that require model-generated histories. First, we examine the complexity of the encoder-decoder attention. We demonstrate empirically that there is a sparse sentence structure in document summarization that can be exploited by constraining the attention mechanism to a subset of input sentences, whilst maintaining system performance. Second, we propose a modified architecture that selects the subset of sentences to constrain the encoder-decoder attention. Experiments are carried out on abstractive summarization tasks, including CNN/DailyMail, XSum, Spotify Podcast, and arXiv.
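To illustrate the kind of constraint the abstract describes, below is a minimal sketch of cross-attention restricted to a selected subset of source sentences. It is not the paper's implementation: the sentence-span bookkeeping (`sentence_spans`, `selected`) and the single-head attention helper are illustrative assumptions, showing only how masking unselected sentences out of the encoder-decoder attention could look.

```python
# Minimal sketch (assumed, not the paper's code): constrain encoder-decoder
# (cross-) attention to a chosen subset of input sentences via a boolean mask.
import torch
import torch.nn.functional as F


def cross_attention_mask(src_len, sentence_spans, selected):
    """Boolean mask over encoder positions: True = decoder may attend here.

    sentence_spans: list of (start, end) token-index pairs, one per sentence.
    selected: indices of the sentences kept for cross-attention.
    """
    mask = torch.zeros(src_len, dtype=torch.bool)
    for i in selected:
        start, end = sentence_spans[i]
        mask[start:end] = True
    return mask


def masked_cross_attention(q, k, v, mask):
    """Single-head scaled dot-product attention with unselected positions blocked.

    q: (tgt_len, d); k, v: (src_len, d); mask: (src_len,) boolean.
    """
    scores = q @ k.transpose(-1, -2) / k.size(-1) ** 0.5   # (tgt_len, src_len)
    scores = scores.masked_fill(~mask, float("-inf"))      # drop unselected sentences
    return F.softmax(scores, dim=-1) @ v                   # (tgt_len, d)


if __name__ == "__main__":
    torch.manual_seed(0)
    src_len, tgt_len, d = 12, 3, 8
    q, k, v = torch.randn(tgt_len, d), torch.randn(src_len, d), torch.randn(src_len, d)
    # Three source "sentences" of 4 tokens each; attend only to sentences 0 and 2.
    spans = [(0, 4), (4, 8), (8, 12)]
    mask = cross_attention_mask(src_len, spans, selected=[0, 2])
    print(masked_cross_attention(q, k, v, mask).shape)  # torch.Size([3, 8])
```

With this kind of masking, the cost of each decoder step's cross-attention scales with the number of retained sentence tokens rather than the full source length, which is the saving the abstract points to for inference and for training with model-generated histories.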
Anthology ID:
2021.emnlp-main.739
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9359–9368
URL:
https://aclanthology.org/2021.emnlp-main.739
DOI:
10.18653/v1/2021.emnlp-main.739
Cite (ACL):
Potsawee Manakul and Mark Gales. 2021. Sparsity and Sentence Structure in Encoder-Decoder Attention of Summarization Systems. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9359–9368, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Sparsity and Sentence Structure in Encoder-Decoder Attention of Summarization Systems (Manakul & Gales, EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.739.pdf
Video:
https://aclanthology.org/2021.emnlp-main.739.mp4