Entity-based De-noising Modeling for Controllable Dialogue Summarization

Zhengyuan Liu, Nancy Chen


Abstract
Although fine-tuning pre-trained backbones produces fluent and grammatically correct text in various language generation tasks, factual consistency in abstractive summarization remains challenging. This challenge is especially thorny for dialogue summarization, where neural models often make inaccurate associations between personal named entities and their respective actions. To tackle this type of hallucination, we present an entity-based de-noising model via text perturbation on reference summaries. We then apply this proposed approach in beam search validation, conditional training augmentation, and inference post-editing. Experimental results on the SAMSum corpus show that state-of-the-art models equipped with our proposed method improve generation quality in both automatic evaluation and human assessment.
Anthology ID:
2022.sigdial-1.40
Volume:
Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
September
Year:
2022
Address:
Edinburgh, UK
Editors:
Oliver Lemon, Dilek Hakkani-Tur, Junyi Jessy Li, Arash Ashrafzadeh, Daniel Hernández Garcia, Malihe Alikhani, David Vandyke, Ondřej Dušek
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
407–418
URL:
https://aclanthology.org/2022.sigdial-1.40
DOI:
10.18653/v1/2022.sigdial-1.40
Cite (ACL):
Zhengyuan Liu and Nancy Chen. 2022. Entity-based De-noising Modeling for Controllable Dialogue Summarization. In Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 407–418, Edinburgh, UK. Association for Computational Linguistics.
Cite (Informal):
Entity-based De-noising Modeling for Controllable Dialogue Summarization (Liu & Chen, SIGDIAL 2022)
PDF:
https://aclanthology.org/2022.sigdial-1.40.pdf
Video:
https://youtu.be/5FTuOs7nTK4