Guiding Extractive Summarization with Question-Answering Rewards

Kristjan Arumae, Fei Liu
Abstract
Highlighting while reading is a natural behavior for people to track salient content of a document. It would be desirable to teach an extractive summarizer to do the same. However, a major obstacle to the development of a supervised summarizer is the lack of ground-truth. Manual annotation of extraction units is cost-prohibitive, whereas acquiring labels by automatically aligning human abstracts and source documents can yield inferior results. In this paper we describe a novel framework to guide a supervised, extractive summarization system with question-answering rewards. We argue that quality summaries should serve as document surrogates to answer important questions, and such question-answer pairs can be conveniently obtained from human abstracts. The system learns to promote summaries that are informative, fluent, and perform competitively on question-answering. Our results compare favorably with those reported by strong summarization baselines as evaluated by automatic metrics and human assessors.
Anthology ID:
N19-1264
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2566–2577
URL:
https://aclanthology.org/N19-1264
DOI:
10.18653/v1/N19-1264
Cite (ACL):
Kristjan Arumae and Fei Liu. 2019. Guiding Extractive Summarization with Question-Answering Rewards. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 2566–2577, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Guiding Extractive Summarization with Question-Answering Rewards (Arumae & Liu, NAACL 2019)
PDF:
https://aclanthology.org/N19-1264.pdf
Presentation:
N19-1264.Presentation.pdf
Code:
ucfnlp/summ_qa_rewards
Data:
CNN/Daily Mail