Extractive Summarization with Text Generator

Thang Le, Anh Tuan Luu


Abstract
Standard extractive systems suffer from a lack of gold training signals, since existing corpora provide only document and human-written summary pairs while omitting extractive labels. As a result, existing methods resort to imperfect pseudo-labels that are both biased and error-prone, thereby hindering the learning of extractive models. In contrast, text generators, which are commonly employed in abstractive summarization, can effortlessly overcome this predicament thanks to their flexible sequence-to-sequence architectures. Motivated to bypass this inherent limitation, we investigate the possibility of conducting extractive summarization with text generators. Through extensive experiments covering six summarization benchmarks, we show that high-quality extractive summaries can be assembled by approximating the outputs (abstractive summaries) of these generators. Moreover, we find that the approximate summaries correlate positively with the abstractive ones (i.e., a better generator enables the production of better extractive summaries). Our results signify a new paradigm for training extractive summarizers, i.e., learning with generation (abstractive) objectives rather than extractive schemes.
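To make the core idea concrete, the sketch below illustrates one plausible way to "approximate" a generator's abstractive output with document sentences: greedily select the sentences whose concatenation best overlaps with the generated summary. This is a minimal, hypothetical illustration; the scoring function (a simple unigram-overlap F1 standing in for ROUGE), the greedy selection, and all function names are assumptions, not the authors' exact procedure.

```python
# Hypothetical sketch: build an extractive summary that approximates the
# abstractive summary produced by a text generator. Not the paper's exact method.

def unigram_f1(candidate: str, reference: str) -> float:
    """Unigram-overlap F1 between two texts (a crude stand-in for ROUGE)."""
    cand, ref = candidate.lower().split(), reference.lower().split()
    if not cand or not ref:
        return 0.0
    overlap = len(set(cand) & set(ref))
    if overlap == 0:
        return 0.0
    precision, recall = overlap / len(cand), overlap / len(ref)
    return 2 * precision * recall / (precision + recall)

def approximate_extractive_summary(doc_sentences, generated_summary, max_sents=3):
    """Greedily pick document sentences whose concatenation best matches
    the generator's abstractive summary."""
    selected = []
    while len(selected) < max_sents:
        base = unigram_f1(" ".join(selected), generated_summary)
        best_gain, best_idx = 0.0, None
        for i, sent in enumerate(doc_sentences):
            if sent in selected:
                continue
            gain = unigram_f1(" ".join(selected + [sent]), generated_summary) - base
            if gain > best_gain:
                best_gain, best_idx = gain, i
        if best_idx is None:  # no remaining sentence improves the approximation
            break
        selected.append(doc_sentences[best_idx])
    return selected
```

Under this reading, a stronger generator yields abstractive summaries that are easier to approximate with salient document sentences, which is consistent with the positive correlation the abstract reports.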
Anthology ID:
2024.naacl-long.9
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
157–174
URL:
https://aclanthology.org/2024.naacl-long.9
Cite (ACL):
Thang Le and Anh Tuan Luu. 2024. Extractive Summarization with Text Generator. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 157–174, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Extractive Summarization with Text Generator (Le & Luu, NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.9.pdf
Copyright:
2024.naacl-long.9.copyright.pdf