RST-LoRA: A Discourse-Aware Low-Rank Adaptation for Long Document Abstractive Summarization

Dongqi Pu, Vera Demberg


Abstract
For long document summarization, discourse structure is important for discerning the key content of a text and the relative importance of its sentences. Unfortunately, the integration of Rhetorical Structure Theory (RST) into parameter-efficient fine-tuning strategies for long document summarization remains unexplored. This paper therefore introduces RST-LoRA and proposes four RST-aware variants that explicitly incorporate RST into the LoRA model. Our empirical evaluation demonstrates that incorporating the type and uncertainty of rhetorical relations can complementarily enhance the performance of LoRA on summarization tasks. Furthermore, our best-performing variant outperforms both vanilla LoRA and full-parameter fine-tuning, as confirmed by multiple automatic and human evaluations, and even surpasses previous state-of-the-art methods.
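To make the abstract's core idea concrete, the sketch below shows one plausible way a LoRA linear layer could be modulated by discourse information: the low-rank update for each token is scaled by a weight derived from an RST parser (e.g., the probability that the token lies in a nucleus unit). This is a minimal PyTorch sketch under assumed interfaces, not the paper's actual formulation; the class name RSTWeightedLoRALinear, the rst_weight input, and the weighting scheme are all hypothetical illustrations.

import torch
import torch.nn as nn

class RSTWeightedLoRALinear(nn.Module):
    """Frozen base linear layer plus a low-rank (LoRA) update,
    scaled per token by an RST-derived discourse weight."""

    def __init__(self, in_features: int, out_features: int,
                 r: int = 8, alpha: float = 16.0):
        super().__init__()
        # Frozen pretrained weight W0 (stands in for the base model's layer).
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)
        # Trainable low-rank factors: delta_W = B @ A, scaled by alpha / r.
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor, rst_weight: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, in_features)
        # rst_weight: (batch, seq_len), values in [0, 1], e.g. P(nucleus)
        # per token from an RST parser (hypothetical interface).
        delta = (x @ self.lora_A.T) @ self.lora_B.T * self.scaling
        # Scale the low-rank update by discourse importance, so tokens in
        # rhetorically central (nuclear) units receive a larger adaptation.
        return self.base(x) + rst_weight.unsqueeze(-1) * delta

# Usage example with random stand-in data:
layer = RSTWeightedLoRALinear(1024, 1024, r=8)
x = torch.randn(2, 16, 1024)
w = torch.rand(2, 16)      # e.g., nuclearity probabilities from a parser
y = layer(x, w)            # -> shape (2, 16, 1024)

One appeal of this style of integration is that only the low-rank factors are trained, so the discourse signal steers a small number of parameters rather than the full model.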
Anthology ID:
2024.naacl-long.121
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2200–2220
URL:
https://aclanthology.org/2024.naacl-long.121
Cite (ACL):
Dongqi Pu and Vera Demberg. 2024. RST-LoRA: A Discourse-Aware Low-Rank Adaptation for Long Document Abstractive Summarization. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 2200–2220, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
RST-LoRA: A Discourse-Aware Low-Rank Adaptation for Long Document Abstractive Summarization (Pu & Demberg, NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.121.pdf
Copyright:
2024.naacl-long.121.copyright.pdf