Learn to Copy from the Copying History: Correlational Copy Network for Abstractive Summarization

Haoran Li, Song Xu, Peng Yuan, Yujia Wang, Youzheng Wu, Xiaodong He, Bowen Zhou


Abstract
The copying mechanism has had considerable success in abstractive summarization, enabling models to copy words directly from the input text to the output summary. Existing works mostly employ encoder-decoder attention, which applies copying at each time step independently of previous steps. However, this may sometimes lead to incomplete copying. In this paper, we propose a novel copying scheme named Correlational Copying Network (CoCoNet) that enhances the standard copying mechanism by keeping track of the copying history. It thereby takes advantage of prior copying distributions and, at each time step, explicitly encourages the model to copy the input word that is relevant to the previously copied one. In addition, we strengthen CoCoNet through pre-training with suitable corpora that simulate the copying behaviors. Experimental results show that CoCoNet can copy more accurately and achieves new state-of-the-art performances on summarization benchmarks, including CNN/DailyMail for news summarization and SAMSum for dialogue summarization. The code and checkpoint will be publicly available.
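The core idea of history-aware copying can be sketched as follows. This is a minimal illustration, not the paper's implementation: the interpolation weight `beta` and the `adjacency` relevance matrix are hypothetical stand-ins for the learned components described in the abstract.

```python
import numpy as np

def correlational_copy(attn_t, prev_copy, adjacency, beta=0.5):
    """Sketch of a history-aware copy distribution over source positions.

    attn_t:    (src_len,) current encoder-decoder attention, i.e. the
               standard copy distribution at this decoding step
    prev_copy: (src_len,) copy distribution from the previous decoding step
    adjacency: (src_len, src_len) relevance of source position j given that
               position i was copied before (e.g. high for j == i + 1,
               favoring continuation of a copied span); hypothetical here,
               learned in the actual model
    beta:      interpolation weight (hypothetical; assumed fixed here)
    """
    # Propagate the previous copy distribution through the relevance matrix:
    # probability mass flows toward source words correlated with the
    # previously copied one.
    history = prev_copy @ adjacency
    history = history / (history.sum() + 1e-9)
    # Interpolate with the standard attention-based copy distribution.
    copy_dist = (1.0 - beta) * attn_t + beta * history
    return copy_dist / copy_dist.sum()
```

For example, with uniform attention, a previous copy concentrated on source position 0, and an adjacency matrix that shifts mass to the next position, the combined distribution now favors position 1, continuing the copied span rather than restarting the copy decision from scratch.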
Anthology ID:
2021.emnlp-main.336
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4091–4101
URL:
https://aclanthology.org/2021.emnlp-main.336
DOI:
10.18653/v1/2021.emnlp-main.336
Cite (ACL):
Haoran Li, Song Xu, Peng Yuan, Yujia Wang, Youzheng Wu, Xiaodong He, and Bowen Zhou. 2021. Learn to Copy from the Copying History: Correlational Copy Network for Abstractive Summarization. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 4091–4101, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Learn to Copy from the Copying History: Correlational Copy Network for Abstractive Summarization (Li et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.336.pdf
Video:
https://aclanthology.org/2021.emnlp-main.336.mp4
Code:
hrlinlp/coconet
Data:
CNN/Daily Mail, SAMSum