Event Oriented Abstractive Summarization

Aafiya S Hussain, Talha Z Chafekar, Grishma Sharma, Deepak H Sharma


Abstract
Abstractive summarization models are generally conditioned on the source article alone, so they produce a summary reflecting the article's central theme but cannot focus on specific key areas of the article. To address this, we introduce a novel method for abstractive summarization that uses event information to generate summaries more tailored to the events in the text. We extract events from the text, apply generalized pooling to obtain a representation of these events, and add an event attention block to the decoder to aid the transformer model in summarization. We carried out experiments on the CNN/Daily Mail dataset and the BBC Extreme Summarization dataset. We achieve comparable results on both datasets with less training, and human evaluation scores show better inclusion of event information in the summaries.
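The abstract describes three components: event extraction, generalized pooling over event representations, and an event attention block inside the decoder. The sketch below is not the authors' released code; it is a minimal illustration, assuming PyTorch, of how pooled event vectors could be cross-attended to by an extra attention block in a transformer decoder layer. All module and parameter names (GeneralizedPooling, EventAwareDecoderLayer, event_attn) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GeneralizedPooling(nn.Module):
    """Attention-weighted (generalized) pooling over an event's token embeddings."""
    def __init__(self, d_model: int):
        super().__init__()
        self.score = nn.Linear(d_model, 1)

    def forward(self, event_tokens: torch.Tensor) -> torch.Tensor:
        # event_tokens: (num_events, tokens_per_event, d_model)
        weights = torch.softmax(self.score(event_tokens), dim=1)  # (E, T, 1)
        return (weights * event_tokens).sum(dim=1)                # (E, d_model)

class EventAwareDecoderLayer(nn.Module):
    """Standard decoder layer plus an extra event-attention block over pooled events.
    Causal masking and dropout are omitted for brevity."""
    def __init__(self, d_model: int = 512, nhead: int = 8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.src_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.event_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.ReLU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norms = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(4)])

    def forward(self, tgt, memory, events):
        # tgt: (B, T_tgt, d)  decoder states
        # memory: (B, T_src, d)  encoder outputs
        # events: (B, E, d)  pooled event representations
        x = self.norms[0](tgt + self.self_attn(tgt, tgt, tgt)[0])
        x = self.norms[1](x + self.src_attn(x, memory, memory)[0])
        x = self.norms[2](x + self.event_attn(x, events, events)[0])  # event attention block
        return self.norms[3](x + self.ffn(x))
```

Under these assumptions, the only change to a standard decoder layer is the third residual sub-block, which lets every target position attend over the pooled event vectors in addition to the encoder memory.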
Anthology ID:
2022.icon-main.14
Volume:
Proceedings of the 19th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2022
Address:
New Delhi, India
Editors:
Md. Shad Akhtar, Tanmoy Chakraborty
Venue:
ICON
Publisher:
Association for Computational Linguistics
Pages:
99–108
URL:
https://aclanthology.org/2022.icon-main.14
Cite (ACL):
Aafiya S Hussain, Talha Z Chafekar, Grishma Sharma, and Deepak H Sharma. 2022. Event Oriented Abstractive Summarization. In Proceedings of the 19th International Conference on Natural Language Processing (ICON), pages 99–108, New Delhi, India. Association for Computational Linguistics.
Cite (Informal):
Event Oriented Abstractive Summarization (Hussain et al., ICON 2022)
PDF:
https://aclanthology.org/2022.icon-main.14.pdf