Autoregressive Entity Generation for End-to-End Task-Oriented Dialog

Guanhuan Huang, Xiaojun Quan, Qifan Wang


Abstract
Task-oriented dialog (TOD) systems are often required to interact with an external knowledge base (KB) to retrieve the entity information (e.g., restaurants) needed to support response generation. Most current end-to-end TOD systems either retrieve the KB information explicitly or embed it into model parameters for implicit access. While the first approach demands scanning the KB at each turn of response generation, which is inefficient when the KB scales up, the second approach offers higher flexibility and efficiency. In either approach, the attributes in a response should belong to the same entity; however, such systems may generate responses that mix attributes of conflicting entities. To address this, we propose to generate the entity autoregressively before leveraging it to guide the response generation in an end-to-end system. To ensure entity consistency, we impose a trie constraint on the decoding of an entity. We also introduce a logit concatenation strategy to facilitate gradient backpropagation for end-to-end training. Experiments on MultiWOZ 2.1 single and CAMREST show that our system can generate more high-quality and entity-consistent responses in an end-to-end manner.
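The trie constraint described in the abstract restricts each decoding step to token continuations that form a valid entity name from the KB. A minimal sketch of this idea is below; the nested-dict trie, the `allowed_next` helper, and the sentinel token are illustrative assumptions, not the authors' actual implementation.

```python
# Sketch of trie-constrained decoding over tokenized entity names.
# The trie structure and helper names here are assumptions for illustration.

END = -1  # sentinel token ID marking the end of a complete entity name


def build_trie(entity_token_ids):
    """Build a nested-dict trie from a list of tokenized entity names."""
    trie = {}
    for ids in entity_token_ids:
        node = trie
        for tok in ids:
            node = node.setdefault(tok, {})
        node[END] = {}  # mark a valid terminal position
    return trie


def allowed_next(trie, prefix):
    """Return the set of token IDs that may legally follow `prefix`."""
    node = trie
    for tok in prefix:
        if tok not in node:
            return set()  # prefix has left the trie: no valid continuation
        node = node[tok]
    return set(node.keys())


# Two toy entities tokenized as ID sequences: [5, 7] and [5, 9, 2].
trie = build_trie([[5, 7], [5, 9, 2]])
print(allowed_next(trie, []))      # first token must be 5
print(allowed_next(trie, [5]))     # then either 7 or 9
print(allowed_next(trie, [5, 7]))  # [5, 7] is a complete entity: only END
```

During generation, logits for tokens outside `allowed_next` would be masked to negative infinity, so the decoded entity is always an exact, complete KB entry and attribute conflicts within the generated entity are ruled out by construction.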
Anthology ID:
2022.coling-1.25
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
323–332
URL:
https://aclanthology.org/2022.coling-1.25
Cite (ACL):
Guanhuan Huang, Xiaojun Quan, and Qifan Wang. 2022. Autoregressive Entity Generation for End-to-End Task-Oriented Dialog. In Proceedings of the 29th International Conference on Computational Linguistics, pages 323–332, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Autoregressive Entity Generation for End-to-End Task-Oriented Dialog (Huang et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.25.pdf
Data
MultiWOZ