Abstractive Multi-Document Summarization via Joint Learning with Single-Document Summarization

Hanqi Jin, Xiaojun Wan


Abstract
Single-document and multi-document summarization are closely related in both task definition and solution method. In this work, we propose to improve neural abstractive multi-document summarization by jointly learning an abstractive single-document summarizer. We build a unified model for single-document and multi-document summarization by fully sharing the encoder and decoder and using a decoding controller to aggregate the decoder's outputs for multiple input documents. We evaluate our model on two multi-document summarization datasets: Multi-News and DUC-04. Experimental results show the efficacy of our approach, which substantially outperforms several strong baselines. We also verify the helpfulness of single-document summarization to the abstractive multi-document summarization task.
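
The central component described in the abstract, a decoding controller that aggregates the shared decoder's outputs across the input documents, can be illustrated with a minimal sketch. The code below is an assumption for illustration only (a learned gate that mixes per-document next-token distributions by weighted average); it is not the authors' implementation, and all module and variable names are hypothetical.

# Illustrative sketch only: a shared decoder is assumed to be run over each
# input document, and a "decoding controller" mixes the per-document
# next-token distributions into one distribution with learned weights.
import torch
import torch.nn as nn

class DecodingController(nn.Module):
    """Scores each document's decoder state and mixes the per-document
    vocabulary distributions into a single distribution (assumed design)."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, doc_states, doc_vocab_dists):
        # doc_states:      (num_docs, hidden_size) decoder states, one per document
        # doc_vocab_dists: (num_docs, vocab_size)  next-token distributions per document
        weights = torch.softmax(self.scorer(doc_states).squeeze(-1), dim=0)  # (num_docs,)
        mixed = (weights.unsqueeze(-1) * doc_vocab_dists).sum(dim=0)         # (vocab_size,)
        return mixed

# Toy usage with random tensors standing in for real encoder/decoder outputs.
num_docs, hidden_size, vocab_size = 3, 8, 50
controller = DecodingController(hidden_size)
states = torch.randn(num_docs, hidden_size)
dists = torch.softmax(torch.randn(num_docs, vocab_size), dim=-1)
next_token_dist = controller(states, dists)  # single aggregated distribution

Because the encoder, decoder, and output layer are shared between the single-document and multi-document settings, a single input document would simply bypass this aggregation step (num_docs = 1), which is what makes joint training of the two tasks straightforward.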
Anthology ID:
2020.findings-emnlp.231
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2545–2554
URL:
https://aclanthology.org/2020.findings-emnlp.231
DOI:
10.18653/v1/2020.findings-emnlp.231
Cite (ACL):
Hanqi Jin and Xiaojun Wan. 2020. Abstractive Multi-Document Summarization via Joint Learning with Single-Document Summarization. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 2545–2554, Online. Association for Computational Linguistics.
Cite (Informal):
Abstractive Multi-Document Summarization via Joint Learning with Single-Document Summarization (Jin & Wan, Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.231.pdf
Code:
zhongxia96/mds-and-sds
Data:
Multi-News