MemSum: Extractive Summarization of Long Documents Using Multi-Step Episodic Markov Decision Processes

Nianlong Gu, Elliott Ash, Richard Hahnloser


Abstract
We introduce MemSum (Multi-step Episodic Markov decision process extractive SUMmarizer), a reinforcement-learning-based extractive summarizer enriched at each step with information on the current extraction history. When MemSum iteratively selects sentences into the summary, it considers a broad information set that would intuitively also be used by humans in this task: 1) the text content of the sentence, 2) the global text context of the rest of the document, and 3) the extraction history consisting of the set of sentences that have already been extracted. With a lightweight architecture, MemSum obtains state-of-the-art test-set performance (ROUGE) in summarizing long documents taken from PubMed, arXiv, and GovReport. Ablation studies demonstrate the importance of local, global, and history information. A human evaluation confirms the high quality and low redundancy of the generated summaries, stemming from MemSum’s awareness of extraction history.
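The abstract's multi-step extraction loop can be sketched as a simple greedy stand-in for MemSum's learned policy. Note this is an illustrative sketch only: the names `extract_summary` and `toy_score` are hypothetical, and the real model scores sentences with learned LSTM-based encoders and a stopping action trained via reinforcement learning, not the hand-written heuristic below.

```python
from typing import Callable, List, Sequence

def extract_summary(
    sentences: Sequence[str],
    score: Callable[[int, Sequence[str], List[int]], float],
    max_steps: int = 3,
    stop_threshold: float = 0.0,
) -> List[int]:
    """Greedy stand-in for MemSum's policy: at each step, score every
    remaining sentence given (1) its own content, (2) the whole document
    as global context, and (3) the extraction history, then pick the
    best sentence or stop the episode."""
    history: List[int] = []
    for _ in range(max_steps):
        remaining = [i for i in range(len(sentences)) if i not in history]
        if not remaining:
            break
        best = max(remaining, key=lambda i: score(i, sentences, history))
        if score(best, sentences, history) <= stop_threshold:
            break  # the learned policy can likewise choose to terminate
        history.append(best)
    return history

# Hypothetical history-aware score: prefer longer sentences, penalize
# word overlap with already-extracted ones (illustrating how history
# awareness reduces redundancy).
def toy_score(i: int, sents: Sequence[str], history: List[int]) -> float:
    words = set(sents[i].split())
    overlap = max(
        (len(words & set(sents[j].split())) for j in history), default=0
    )
    return len(words) - 2 * overlap

doc = [
    "MemSum selects sentences step by step.",
    "MemSum selects sentences step by step again.",
    "Global context and extraction history both inform each choice.",
]
print(extract_summary(doc, toy_score, max_steps=2))  # → [2, 0]
```

With history awareness, the near-duplicate second sentence is never picked after the first: its overlap penalty drives its score below the alternatives, mirroring the low redundancy the human evaluation reports.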
Anthology ID:
2022.acl-long.450
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6507–6522
URL:
https://aclanthology.org/2022.acl-long.450
DOI:
10.18653/v1/2022.acl-long.450
Bibkey:
Cite (ACL):
Nianlong Gu, Elliott Ash, and Richard Hahnloser. 2022. MemSum: Extractive Summarization of Long Documents Using Multi-Step Episodic Markov Decision Processes. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6507–6522, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
MemSum: Extractive Summarization of Long Documents Using Multi-Step Episodic Markov Decision Processes (Gu et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.450.pdf
Software:
 2022.acl-long.450.software.zip
Code:
nianlonggu/memsum
Data:
arXiv HEP-TH citation graph, GovReport, PubMed