Fact-level Extractive Summarization with Hierarchical Graph Mask on BERT

Ruifeng Yuan, Zili Wang, Wenjie Li


Abstract
Most current extractive summarization models generate summaries by selecting salient sentences. However, one problem with sentence-level extractive summarization is the gap between the human-written gold summary and the oracle sentence labels. In this paper, we propose to extract fact-level semantic units for better extractive summarization. We also introduce a hierarchical structure that incorporates multiple granularities of textual information into the model. In addition, we integrate our model with BERT using a hierarchical graph mask, which combines BERT's strength in natural language understanding with structural information without increasing the scale of the model. Experiments on the CNN/DailyMail dataset show that our model achieves state-of-the-art results.
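
The core mechanism named in the abstract, encoding a token/fact/sentence hierarchy as a mask over BERT's self-attention, can be illustrated with a minimal sketch. The function name, span encoding, and node layout below are illustrative assumptions, not the authors' implementation; see the linked repository for the actual code.

    # Minimal sketch (assumed interface, not the paper's code): build a
    # boolean attention mask over a sequence laid out as
    # [tokens | fact nodes | sentence nodes]. True means "may attend".
    import torch

    def build_hierarchical_mask(fact_of_token, sent_of_fact,
                                n_tokens, n_facts, n_sents):
        L = n_tokens + n_facts + n_sents
        mask = torch.zeros(L, L, dtype=torch.bool)
        fact_off, sent_off = n_tokens, n_tokens + n_facts

        # Tokens attend to tokens in the same fact, and to their fact node.
        for i in range(n_tokens):
            for j in range(n_tokens):
                if fact_of_token[i] == fact_of_token[j]:
                    mask[i, j] = True
            f = fact_off + fact_of_token[i]
            mask[i, f] = mask[f, i] = True

        # Fact nodes attend to their parent sentence node, and vice versa.
        for f in range(n_facts):
            s = sent_off + sent_of_fact[f]
            mask[fact_off + f, s] = mask[s, fact_off + f] = True

        # Sentence nodes form a fully connected document-level layer.
        mask[sent_off:, sent_off:] = True
        mask |= torch.eye(L, dtype=torch.bool)  # self-loops for every node
        return mask

    # Applied inside self-attention as an additive bias, e.g.:
    # attn_scores = attn_scores.masked_fill(~mask, float('-inf'))

Because the structure is expressed only through the mask, the graph can be imposed on a pretrained BERT without adding parameters, which is consistent with the abstract's claim of not increasing the model's scale.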
Anthology ID:
2020.coling-main.493
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5629–5639
URL:
https://aclanthology.org/2020.coling-main.493
DOI:
10.18653/v1/2020.coling-main.493
Bibkey:
Cite (ACL):
Ruifeng Yuan, Zili Wang, and Wenjie Li. 2020. Fact-level Extractive Summarization with Hierarchical Graph Mask on BERT. In Proceedings of the 28th International Conference on Computational Linguistics, pages 5629–5639, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Fact-level Extractive Summarization with Hierarchical Graph Mask on BERT (Yuan et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.493.pdf
Code:
Ruifeng-paper/FactExsum-coling2020