Mind the Gap! Injecting Commonsense Knowledge for Abstractive Dialogue Summarization

Seungone Kim, Se June Joo, Hyungjoo Chae, Chaehyeong Kim, Seung-won Hwang, Jinyoung Yeo
Abstract
In this paper, we propose to leverage a unique characteristic of dialogues, namely that commonsense knowledge is shared across participants, to resolve the difficulty of summarizing them. We present SICK, a framework that uses commonsense inferences as additional context. Compared to previous work that relies solely on the input dialogue, SICK uses an external knowledge model to generate a rich set of commonsense inferences and selects the most probable one with a similarity-based selection method. Built upon SICK, SICK++ further uses commonsense as supervision: the task of generating commonsense inferences is added on top of dialogue summarization in a multi-task learning setting. Experimental results show that with injected commonsense knowledge, our framework generates more informative and consistent summaries than existing methods.
Anthology ID:
2022.coling-1.548
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
6285–6300
URL:
https://aclanthology.org/2022.coling-1.548
Cite (ACL):
Seungone Kim, Se June Joo, Hyungjoo Chae, Chaehyeong Kim, Seung-won Hwang, and Jinyoung Yeo. 2022. Mind the Gap! Injecting Commonsense Knowledge for Abstractive Dialogue Summarization. In Proceedings of the 29th International Conference on Computational Linguistics, pages 6285–6300, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Mind the Gap! Injecting Commonsense Knowledge for Abstractive Dialogue Summarization (Kim et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.548.pdf
Data
SAMSum Corpus