PELMS: Pre-training for Effective Low-Shot Multi-Document Summarization

Joseph Peper, Wenzhao Qiu, Lu Wang


Abstract
We investigate pre-training techniques for abstractive multi-document summarization (MDS), which is much less studied than summarizing single documents. Though recent work has demonstrated the effectiveness of highlighting information salience for pre-training strategy design, these approaches struggle to generate abstractive and reflective summaries, which are critical properties for MDS. To this end, we present **PELMS**, a pre-trained model that uses pre-training objectives based on semantic coherence heuristics and faithfulness constraints together with unlabeled multi-document inputs, to promote the generation of concise, fluent, and faithful summaries. To support the training of PELMS, we compile **MultiPT**, a multi-document pre-training corpus containing over 93 million documents that form more than 3 million unlabeled topic-centric document clusters, covering diverse genres such as product reviews, news, and general knowledge. We perform an extensive evaluation of PELMS in low-shot settings on a wide range of MDS datasets. Our approach consistently outperforms competitive baselines with respect to overall informativeness, abstractiveness, coherence, and faithfulness, and with minimal fine-tuning can match the performance of much larger language models (e.g., GPT-4).
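As a rough illustration of the low-shot setting the abstract describes, the sketch below fine-tunes an off-the-shelf long-input seq2seq checkpoint on a handful of multi-document examples with Hugging Face Transformers. The checkpoint name (`allenai/led-base-16384`), the document separator, the toy data, and all hyperparameters are illustrative assumptions, not the paper's actual configuration or the released PELMS model.

```python
# Minimal, hypothetical sketch of low-shot fine-tuning for multi-document
# summarization. Checkpoint, separator, and hyperparameters are assumptions.
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)
from datasets import Dataset

checkpoint = "allenai/led-base-16384"  # placeholder long-input seq2seq model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# A tiny "low-shot" training set: each example pairs a topic-centric cluster
# of documents with a reference summary (toy data for illustration only).
few_shot = Dataset.from_dict({
    "documents": [["Doc A text ...", "Doc B text ..."]],
    "summary": ["A concise reference summary ..."],
})

def preprocess(example):
    # Concatenate the cluster's documents with a separator before encoding.
    source = " ||| ".join(example["documents"])
    model_inputs = tokenizer(source, max_length=4096, truncation=True)
    labels = tokenizer(text_target=example["summary"],
                       max_length=256, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = few_shot.map(preprocess, remove_columns=few_shot.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="low_shot_mds",
    per_device_train_batch_size=1,
    num_train_epochs=3,
    learning_rate=3e-5,
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```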
Anthology ID:
2024.naacl-long.423
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
7645–7667
URL:
https://aclanthology.org/2024.naacl-long.423
Cite (ACL):
Joseph Peper, Wenzhao Qiu, and Lu Wang. 2024. PELMS: Pre-training for Effective Low-Shot Multi-Document Summarization. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 7645–7667, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
PELMS: Pre-training for Effective Low-Shot Multi-Document Summarization (Peper et al., NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.423.pdf
Copyright:
2024.naacl-long.423.copyright.pdf