Does Pretraining for Summarization Require Knowledge Transfer?

Kundan Krishna, Jeffrey Bigham, Zachary C. Lipton


Abstract
Pretraining techniques leveraging enormous datasets have driven recent advances in text summarization. While folk explanations suggest that knowledge transfer accounts for pretraining’s benefits, little is known about why it works or what makes a pretraining task or dataset suitable. In this paper, we challenge the knowledge transfer story, showing that by pretraining on documents consisting of character n-grams selected at random, we can nearly match the performance of models pretrained on real corpora. This work holds the promise of eliminating upstream corpora, which may alleviate some concerns over offensive language, bias, and copyright issues. To see whether the small residual benefit of using real data could be accounted for by the structure of the pretraining task, we design several tasks motivated by a qualitative study of summarization corpora. However, these tasks confer no appreciable benefit, leaving open the possibility of a small role for knowledge transfer.
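
The central manipulation described above, pretraining on synthetic documents built from randomly selected character n-grams, can be sketched roughly as follows. This is an illustrative Python sketch, not the authors' released code; the vocabulary size, n-gram length, and document length are assumed values chosen for demonstration only.

    import random
    import string

    def random_ngram_vocab(num_ngrams=500, n=3, seed=0):
        # Build a vocabulary of random character n-grams (illustrative parameters).
        rng = random.Random(seed)
        return ["".join(rng.choice(string.ascii_lowercase) for _ in range(n))
                for _ in range(num_ngrams)]

    def synthetic_document(vocab, num_tokens=200, seed=1):
        # Compose one synthetic pretraining "document" by sampling n-grams
        # uniformly at random from the vocabulary.
        rng = random.Random(seed)
        return " ".join(rng.choice(vocab) for _ in range(num_tokens))

    vocab = random_ngram_vocab()
    print(synthetic_document(vocab)[:60])  # e.g. a string of random trigrams

A corpus of such documents contains no real-world knowledge, which is what makes it a useful probe of the knowledge transfer explanation.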
Anthology ID:
2021.findings-emnlp.273
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3178–3189
URL:
https://aclanthology.org/2021.findings-emnlp.273
DOI:
10.18653/v1/2021.findings-emnlp.273
Cite (ACL):
Kundan Krishna, Jeffrey Bigham, and Zachary C. Lipton. 2021. Does Pretraining for Summarization Require Knowledge Transfer?. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 3178–3189, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Does Pretraining for Summarization Require Knowledge Transfer? (Krishna et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.273.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.273.mp4
Data
RotoWire