Getting Better Dialogue Context for Knowledge Identification by Leveraging Document-level Topic Shift

Nhat Tran, Diane Litman


Abstract
To build a goal-oriented dialogue system that can generate responses given a knowledge base, identifying the relevant pieces of information to be grounded in is vital. When the number of documents in the knowledge base is large, retrieval approaches are typically used to identify the top relevant documents. However, most prior work simply uses an entire dialogue history to guide retrieval, rather than exploiting a dialogue’s topical structure. In this work, we examine the importance of building the proper contextualized dialogue history when document-level topic shifts are present. Our results suggest that excluding irrelevant turns from the dialogue history (e.g., excluding turns not grounded in the same document as the current turn) leads to better retrieval results. We also propose a cascading approach utilizing the topical nature of a knowledge-grounded conversation to further manipulate the dialogue history used as input to the retrieval models.
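The core idea of the abstract (exclude history turns not grounded in the same document as the current turn, then use the remaining turns as the retrieval query) can be sketched as follows. This is not the authors' code; all function names, the separator token, and the toy data are hypothetical illustrations of the filtering step.

```python
# Hedged sketch (not the authors' implementation): keep only dialogue turns
# grounded in the same document as the current turn, then concatenate them
# into a single retrieval query string. All names here are hypothetical.

def filter_history(history, current_doc_id):
    """Keep only turns whose grounding document matches the current turn's.

    history: list of (turn_text, grounding_doc_id) pairs, oldest first.
    """
    return [turn for turn, doc_id in history if doc_id == current_doc_id]

def build_query(history, current_turn, current_doc_id, sep=" [SEP] "):
    """Build a retrieval query from the filtered history plus the current turn."""
    kept = filter_history(history, current_doc_id)
    return sep.join(kept + [current_turn])

if __name__ == "__main__":
    # Toy conversation with a document-level topic shift (license -> loans -> license).
    history = [
        ("How do I renew my license?", "dmv_doc"),
        ("What about my student loans?", "loan_doc"),
        ("Is there a renewal fee?", "dmv_doc"),
    ]
    query = build_query(history, "Can I pay the fee online?", "dmv_doc")
    print(query)  # loan-related turn is excluded from the retrieval query
```

The cascading approach described in the abstract would sit on top of this: a first-stage model predicts whether a topic shift occurred, and only then is the history pruned before retrieval.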
Anthology ID:
2022.sigdial-1.36
Volume:
Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
September
Year:
2022
Address:
Edinburgh, UK
Editors:
Oliver Lemon, Dilek Hakkani-Tur, Junyi Jessy Li, Arash Ashrafzadeh, Daniel Hernández Garcia, Malihe Alikhani, David Vandyke, Ondřej Dušek
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
368–375
URL:
https://aclanthology.org/2022.sigdial-1.36
DOI:
10.18653/v1/2022.sigdial-1.36
Bibkey:
Cite (ACL):
Nhat Tran and Diane Litman. 2022. Getting Better Dialogue Context for Knowledge Identification by Leveraging Document-level Topic Shift. In Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 368–375, Edinburgh, UK. Association for Computational Linguistics.
Cite (Informal):
Getting Better Dialogue Context for Knowledge Identification by Leveraging Document-level Topic Shift (Tran & Litman, SIGDIAL 2022)
PDF:
https://aclanthology.org/2022.sigdial-1.36.pdf
Video:
https://youtu.be/hpr3fXbWPVA
Data
Doc2Dial, MultiDoc2Dial, doc2dial