Jointly Learning Salience and Redundancy by Adaptive Sentence Reranking for Extractive Summarization

Zhang Ximing, Liu Ruifang


Abstract
Extractive text summarization seeks to extract indicative sentences from a source document and assemble them to form a summary. Selecting salient but not redundant sentences has always been the main challenge. Unlike previous two-stage strategies, this paper presents a unified end-to-end model that learns to rerank sentences by modeling salience and redundancy simultaneously. Through this ranking mechanism, our method improves the quality of the overall candidate summary by giving higher scores to sentences that bring more novel information. We first design a summary-level measure to evaluate the cumulative gain of each candidate summary. Then we propose an adaptive training objective to rerank the sentences, aiming to obtain a summary with a high summary-level score. Experimental results and evaluation show that our method outperforms strong baselines on three datasets and further boosts the quality of candidate summaries, which strongly indicates the effectiveness of the proposed framework.
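The abstract's core idea of a summary-level cumulative gain can be illustrated with a minimal sketch: score the running summary after each appended sentence and take the marginal improvement as that sentence's gain. This is not the paper's actual measure or training objective; the `unigram_f1` function below is a simplified stand-in for a ROUGE-style score, and all names are hypothetical.

```python
from collections import Counter


def unigram_f1(candidate: str, reference: str) -> float:
    """Unigram-overlap F1, a simplified stand-in for a ROUGE-style score."""
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


def cumulative_gains(ranked_sentences: list[str], reference: str) -> list[float]:
    """Marginal gain of appending each sentence to the running summary.

    A sentence that is salient but redundant with already-selected
    sentences yields little or no gain, so a reranker trained to
    maximise these gains is pushed toward novel information.
    """
    gains, summary, prev = [], [], 0.0
    for sent in ranked_sentences:
        summary.append(sent)
        score = unigram_f1(" ".join(summary), reference)
        gains.append(score - prev)
        prev = score
    return gains


reference = "the cat sat on the mat"
sentences = ["the cat sat", "the cat sat", "on the mat"]
gains = cumulative_gains(sentences, reference)
# The duplicated second sentence adds no novel information, so its gain is zero,
# while the first and third sentences each receive a positive gain.
```

Under this toy scoring, redundancy is penalised implicitly: repeating a sentence leaves the summary-level score unchanged, so its gain vanishes, which is the intuition the paper's reranking objective builds on.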
Anthology ID:
2021.ccl-1.85
Volume:
Proceedings of the 20th Chinese National Conference on Computational Linguistics
Month:
August
Year:
2021
Address:
Huhhot, China
Editors:
Sheng Li (李生), Maosong Sun (孙茂松), Yang Liu (刘洋), Hua Wu (吴华), Kang Liu (刘康), Wanxiang Che (车万翔), Shizhu He (何世柱), Gaoqi Rao (饶高琦)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
952–963
Language:
English
URL:
https://aclanthology.org/2021.ccl-1.85
Cite (ACL):
Zhang Ximing and Liu Ruifang. 2021. Jointly Learning Salience and Redundancy by Adaptive Sentence Reranking for Extractive Summarization. In Proceedings of the 20th Chinese National Conference on Computational Linguistics, pages 952–963, Huhhot, China. Chinese Information Processing Society of China.
Cite (Informal):
Jointly Learning Salience and Redundancy by Adaptive Sentence Reranking for Extractive Summarization (Ximing & Ruifang, CCL 2021)
PDF:
https://aclanthology.org/2021.ccl-1.85.pdf
Data
WikiHow