HiStruct+: Improving Extractive Text Summarization with Hierarchical Structure Information

Qian Ruan, Malte Ostendorff, Georg Rehm


Abstract
Transformer-based language models usually treat texts as linear sequences. However, most texts also have an inherent hierarchical structure, i.e., parts of a text can be identified using their position in this hierarchy. In addition, section titles usually indicate the common topic of their respective sentences. We propose a novel approach to formulate, extract, encode and inject hierarchical structure information explicitly into an extractive summarization model based on a pre-trained, encoder-only Transformer language model (HiStruct+ model), which substantially improves the SOTA ROUGE scores for extractive summarization on PubMed and arXiv. Across various experimental settings on three datasets (i.e., CNN/DailyMail, PubMed and arXiv), our HiStruct+ model consistently outperforms a strong baseline that differs from our model only in that the hierarchical structure information is not injected. We also observe that the more conspicuous the hierarchical structure of a dataset, the larger the improvement our method achieves. The ablation study demonstrates that the hierarchical position information is the main contributor to our model's SOTA performance.
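The core idea of the abstract (encoding a sentence's hierarchical position and injecting it into its representation) can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the element-wise sum injection, and the toy embedding tables are all illustrative assumptions.

```python
# Hedged sketch of hierarchical-position injection: each sentence is located
# by (section index, sentence-in-section index); embeddings for both are
# summed and added to the sentence vector. All names are hypothetical.

def hierarchical_position_embedding(section_idx, sent_idx, sec_table, sent_table):
    """Combine a section embedding and an in-section sentence embedding by sum."""
    return [a + b for a, b in zip(sec_table[section_idx], sent_table[sent_idx])]

def inject(sentence_vec, hier_vec):
    """Inject hierarchical structure info via element-wise addition."""
    return [a + b for a, b in zip(sentence_vec, hier_vec)]

# Toy embedding tables (dimension 4) standing in for learned embedding layers.
sec_table = [[0.1] * 4, [0.2] * 4]
sent_table = [[0.01] * 4, [0.02] * 4, [0.03] * 4]

# A sentence in section 1, at in-section position 2.
sentence_vec = [1.0, 0.0, 0.5, -0.5]
hier_vec = hierarchical_position_embedding(1, 2, sec_table, sent_table)
enriched = inject(sentence_vec, hier_vec)
```

In the paper's setting the enriched sentence vectors would then be scored by the extractive summarization head; here the sketch only shows the injection step itself.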
Anthology ID:
2022.findings-acl.102
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1292–1308
URL:
https://aclanthology.org/2022.findings-acl.102
DOI:
10.18653/v1/2022.findings-acl.102
Cite (ACL):
Qian Ruan, Malte Ostendorff, and Georg Rehm. 2022. HiStruct+: Improving Extractive Text Summarization with Hierarchical Structure Information. In Findings of the Association for Computational Linguistics: ACL 2022, pages 1292–1308, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
HiStruct+: Improving Extractive Text Summarization with Hierarchical Structure Information (Ruan et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.102.pdf
Software:
2022.findings-acl.102.software.zip
Data:
PubMed, arXiv