Abstractive Summarizers are Excellent Extractive Summarizers

Daniel Varab, Yumo Xu


Abstract
Extractive and abstractive summarization designs have historically been fragmented, limiting the benefits that often arise from compatible model architectures. In this paper, we explore the potential synergies of modeling extractive summarization with an abstractive summarization system and propose three novel inference algorithms using the sequence-to-sequence architecture. We evaluate them on the CNN & DailyMail dataset and show that recent advancements in abstractive system design enable abstractive systems not only to compete with, but even to surpass, the performance of extractive systems with custom architectures. To our surprise, abstractive systems achieve this without being exposed to extractive oracle summaries, thereby allowing, for the first time, a single model to produce both abstractive and extractive summaries. This evidence questions our fundamental understanding of extractive system design and the necessity of extractive labels, while paving the way for promising research directions in hybrid models.
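The abstract does not spell out the three proposed inference algorithms. As a rough illustration of the general idea of reusing an abstractive sequence-to-sequence model for extraction (not the authors' actual algorithms), the following minimal sketch scores each source sentence by the conditional likelihood an off-the-shelf abstractive summarizer assigns to it and keeps the top-scoring sentences; the model name, function, and selection heuristic are assumptions for illustration only.

# Illustrative sketch only: rank source sentences with an abstractive
# seq2seq model's conditional likelihood and return the best ones as an
# "extractive" summary. Not the paper's proposed inference algorithms.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "facebook/bart-large-cnn"  # assumed off-the-shelf abstractive summarizer
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME).eval()

def extract_summary(document: str, sentences: list[str], k: int = 3) -> list[str]:
    """Score each sentence as a candidate output given the document, keep the top k."""
    enc = tokenizer(document, return_tensors="pt", truncation=True, max_length=1024)
    scores = []
    with torch.no_grad():
        for sent in sentences:
            labels = tokenizer(sent, return_tensors="pt",
                               truncation=True, max_length=128).input_ids
            out = model(input_ids=enc.input_ids,
                        attention_mask=enc.attention_mask,
                        labels=labels)
            # out.loss is the mean token-level cross-entropy; lower loss = more likely
            scores.append(-out.loss.item())
    ranked = sorted(range(len(sentences)), key=lambda i: scores[i], reverse=True)
    return [sentences[i] for i in sorted(ranked[:k])]  # preserve document order

Note that this sketch requires no extractive labels at any point, which mirrors the abstract's observation that the abstractive model is never exposed to extractive oracle summaries.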
Anthology ID:
2023.acl-short.29
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
330–339
URL:
https://aclanthology.org/2023.acl-short.29
DOI:
10.18653/v1/2023.acl-short.29
Cite (ACL):
Daniel Varab and Yumo Xu. 2023. Abstractive Summarizers are Excellent Extractive Summarizers. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 330–339, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Abstractive Summarizers are Excellent Extractive Summarizers (Varab & Xu, ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.29.pdf
Video:
https://aclanthology.org/2023.acl-short.29.mp4