Mind the Gap Between Conversations for Improved Long-Term Dialogue Generation

Qiang Zhang, Jason Naradowsky, Yusuke Miyao


Abstract
Knowing how to end and resume conversations over time is a natural part of communication, allowing discussions to span weeks, months, or years. The duration of the gap between conversations dictates which topics are relevant and which questions to ask, and dialogue systems that do not explicitly model time may generate unnatural responses. In this work we explore the idea of making dialogue models aware of time, and present GapChat, a multi-session dialogue dataset in which the time between sessions varies. While the dataset is constructed in real time, progress on events in the speakers’ lives is simulated in order to create realistic dialogues occurring across a long timespan. We expose time information to the model and compare different representations of time and event progress. In human evaluation we show that time-aware models perform better on metrics judging the relevance of the chosen topics and the information gained from the conversation.
Anthology ID:
2023.findings-emnlp.720
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10735–10762
URL:
https://aclanthology.org/2023.findings-emnlp.720
DOI:
10.18653/v1/2023.findings-emnlp.720
Cite (ACL):
Qiang Zhang, Jason Naradowsky, and Yusuke Miyao. 2023. Mind the Gap Between Conversations for Improved Long-Term Dialogue Generation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 10735–10762, Singapore. Association for Computational Linguistics.
Cite (Informal):
Mind the Gap Between Conversations for Improved Long-Term Dialogue Generation (Zhang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.720.pdf