Balancing Lexical and Semantic Quality in Abstractive Summarization

Jeewoo Sul, Yong Suk Choi


Abstract
An important problem of the sequence-to-sequence neural models widely used in abstractive summarization is exposure bias. To alleviate this problem, re-ranking systems have been applied in recent years. Despite some performance improvements, this approach remains underexplored. Previous works have mostly specified the rank through the ROUGE score and aligned candidate summaries, but there can be quite a large gap between the lexical overlap metric and semantic similarity. In this paper, we propose a novel training method in which a re-ranker balances lexical and semantic quality. We further define false positives in ranking and present a strategy to reduce their influence. Experiments on the CNN/DailyMail and XSum datasets show that our method can estimate the meaning of summaries without seriously degrading the lexical aspect. More specifically, it achieves an 89.67 BERTScore on the CNN/DailyMail dataset, reaching new state-of-the-art performance. Our code is publicly available at https://github.com/jeewoo1025/BalSum.
Anthology ID:
2023.acl-short.56
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
637–647
URL:
https://aclanthology.org/2023.acl-short.56
DOI:
10.18653/v1/2023.acl-short.56
Cite (ACL):
Jeewoo Sul and Yong Suk Choi. 2023. Balancing Lexical and Semantic Quality in Abstractive Summarization. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 637–647, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Balancing Lexical and Semantic Quality in Abstractive Summarization (Sul & Choi, ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.56.pdf
Video:
https://aclanthology.org/2023.acl-short.56.mp4