EDC: Effective and Efficient Dialog Comprehension For Dialog State Tracking

Qifan Lu, Bhaskar Ramasubramanian, Radha Poovendran


Abstract
In Task-Oriented Dialog (TOD) systems, Dialog State Tracking (DST) extracts structured information from user and system utterances, which can then be used to query databases and form responses to users. The two major categories of DST methods, sequential and independent methods, face a trade-off between accuracy and efficiency. To resolve this issue, we propose Effective and Efficient Dialog Comprehension (EDC), an alternative DST approach that leverages the tree structure of the dialog state. EDC predicts the domains, slot names, and slot values of the dialog state step by step for better accuracy, and efficiently encodes dialog contexts with causal attention patterns. We evaluate EDC on several popular TOD datasets, where it achieves state-of-the-art Joint Goal Accuracy (JGA). We also show, theoretically and empirically, that EDC is more efficient than the model designs used in previous work.
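To make the tree-structured, step-by-step decoding concrete, the minimal sketch below represents a dialog state as the nested domain → slot → value mapping the abstract describes and decodes it in three stages: active domains first, then slot names per domain, then one value per slot. The predictor functions (predict_domains, predict_slots, predict_value) are hypothetical stand-ins for a trained model's decoding steps, not EDC's actual API, and the sketch does not reproduce EDC's causal-attention context encoding.

from typing import Callable, Dict, List

def track_dialog_state(
    dialog_context: str,
    predict_domains: Callable[[str], List[str]],
    predict_slots: Callable[[str, str], List[str]],
    predict_value: Callable[[str, str, str], str],
) -> Dict[str, Dict[str, str]]:
    """Decode a dialog state as a two-level tree: {domain: {slot: value}}."""
    state: Dict[str, Dict[str, str]] = {}
    # Stage 1: predict which domains are active in the dialog so far.
    for domain in predict_domains(dialog_context):
        state[domain] = {}
        # Stage 2: predict which slots of this domain are mentioned.
        for slot in predict_slots(dialog_context, domain):
            # Stage 3: predict each value, conditioned on domain and slot name.
            state[domain][slot] = predict_value(dialog_context, domain, slot)
    return state

# Toy usage with rule-based stubs in place of a trained model.
state = track_dialog_state(
    "I need a cheap hotel in the north.",
    predict_domains=lambda ctx: ["hotel"],
    predict_slots=lambda ctx, domain: ["pricerange", "area"],
    predict_value=lambda ctx, domain, slot: {"pricerange": "cheap", "area": "north"}[slot],
)
print(state)  # {'hotel': {'pricerange': 'cheap', 'area': 'north'}}

The nested dictionary returned here is the kind of structure JGA scores: a prediction counts as correct only if the entire tree matches the gold state exactly.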
Anthology ID: 2024.naacl-long.232
Volume: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month: June
Year: 2024
Address: Mexico City, Mexico
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 4151–4165
URL: https://aclanthology.org/2024.naacl-long.232
Cite (ACL):
Qifan Lu, Bhaskar Ramasubramanian, and Radha Poovendran. 2024. EDC: Effective and Efficient Dialog Comprehension For Dialog State Tracking. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 4151–4165, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
EDC: Effective and Efficient Dialog Comprehension For Dialog State Tracking (Lu et al., NAACL 2024)
PDF: https://aclanthology.org/2024.naacl-long.232.pdf
Copyright: 2024.naacl-long.232.copyright.pdf