Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers

Shusheng Xu, Xingxing Zhang, Yi Wu, Furu Wei, Ming Zhou


Abstract
Unsupervised extractive document summarization aims to select important sentences from a document without using labeled summaries during training. Existing methods are mostly graph-based with sentences as nodes and edge weights measured by sentence similarities. In this work, we find that transformer attentions can be used to rank sentences for unsupervised extractive summarization. Specifically, we first pre-train a hierarchical transformer model using unlabeled documents only. Then we propose a method to rank sentences using sentence-level self-attentions and pre-training objectives. Experiments on CNN/DailyMail and New York Times datasets show our model achieves state-of-the-art performance on unsupervised summarization. We also find in experiments that our model is less dependent on sentence positions. When using a linear combination of our model and a recent unsupervised model explicitly modeling sentence positions, we obtain even better results.
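As a rough illustration of the attention-based ranking idea described in the abstract, the sketch below scores each sentence by the attention mass it receives in a sentence-level self-attention matrix and extracts the top-k sentences. This is a hypothetical sketch, not the authors' implementation (see the xssstory/STAS repository linked below); the function name, the averaging scheme, and the random stand-in attention matrix are all assumptions.

    # Minimal sketch: rank sentences by received sentence-level self-attention.
    # In the paper, the attention matrix would come from the pre-trained
    # hierarchical transformer; here it is random stand-in data.
    import numpy as np

    def rank_sentences_by_attention(attn, k=3):
        """attn: (n_sents, n_sents) matrix, attn[i, j] = attention from sentence i to j.
        Returns indices of the k highest-scoring sentences, in document order."""
        n = attn.shape[0]
        # Score sentence j by the average attention it receives from other
        # sentences, ignoring self-attention on the diagonal.
        mask = 1.0 - np.eye(n)
        scores = (attn * mask).sum(axis=0) / np.maximum(mask.sum(axis=0), 1.0)
        top = np.argsort(-scores)[:k]
        return sorted(top.tolist())

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        logits = rng.normal(size=(6, 6))
        attn = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # row-wise softmax
        print(rank_sentences_by_attention(attn, k=2))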
Anthology ID: 2020.findings-emnlp.161
Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
Month: November
Year: 2020
Address: Online
Editors: Trevor Cohn, Yulan He, Yang Liu
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1784–1795
URL: https://aclanthology.org/2020.findings-emnlp.161
DOI: 10.18653/v1/2020.findings-emnlp.161
Cite (ACL): Shusheng Xu, Xingxing Zhang, Yi Wu, Furu Wei, and Ming Zhou. 2020. Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 1784–1795, Online. Association for Computational Linguistics.
Cite (Informal): Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers (Xu et al., Findings 2020)
PDF: https://aclanthology.org/2020.findings-emnlp.161.pdf
Optional supplementary material: 2020.findings-emnlp.161.OptionalSupplementaryMaterial.pdf
Code: xssstory/STAS
Data: New York Times Annotated Corpus