Query Enhanced Knowledge-Intensive Conversation via Unsupervised Joint Modeling

Mingzhu Cai, Siqi Bao, Xin Tian, Huang He, Fan Wang, Hua Wu


Abstract
In this paper, we propose QKConv, an unsupervised query-enhanced approach for knowledge-intensive conversations. QKConv consists of three modules: a query generator, an off-the-shelf knowledge selector, and a response generator. QKConv is optimized through joint training, which produces the response by exploring multiple candidate queries and leveraging the corresponding selected knowledge. The joint training relies solely on the dialogue context and target response, without requiring extra query annotations or knowledge provenance. To evaluate the effectiveness of the proposed QKConv, we conduct experiments on three representative knowledge-intensive conversation datasets: conversational question answering, task-oriented dialogue, and knowledge-grounded conversation. Experimental results reveal that QKConv outperforms all unsupervised methods across the three datasets and achieves competitive performance compared to supervised methods.
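
The abstract sketches the pipeline only at a high level. Below is a minimal, self-contained Python sketch of how the three modules might interact during joint training, assuming (as the abstract suggests) that the training objective marginalizes the response likelihood over multiple candidate queries. All function names and the toy overlap-based scoring are illustrative stand-ins, not the paper's implementation, which uses trained generative models and an off-the-shelf knowledge selector.

```python
import math

# --- Hypothetical stand-ins for the three QKConv modules (names are ours, not the paper's) ---

def generate_candidate_queries(context: str, k: int = 3) -> list[tuple[str, float]]:
    """Query generator: propose k candidate queries with log-probabilities.
    A trivial heuristic stands in for a trained generative model here."""
    last_turn = context.split("\n")[-1]
    candidates = [last_turn, last_turn.lower(), " ".join(last_turn.split()[-3:])][:k]
    # Uniform log-probabilities as a placeholder for the generator's own scores.
    return [(q, math.log(1.0 / len(candidates))) for q in candidates]

def select_knowledge(query: str, corpus: list[str]) -> str:
    """Off-the-shelf knowledge selector: pick the passage with the most word overlap."""
    def overlap(passage: str) -> int:
        return len(set(query.lower().split()) & set(passage.lower().split()))
    return max(corpus, key=overlap)

def response_log_likelihood(context: str, knowledge: str, target: str) -> float:
    """Response generator: log p(target | context, knowledge).
    A real system would score this with a seq2seq model; overlap fakes it here."""
    support = set((context + " " + knowledge).lower().split())
    hits = sum(1 for w in target.lower().split() if w in support)
    return -float(len(target.split()) - hits)  # more overlap -> higher (less negative) score

# --- One joint-training-style step: marginalize the response loss over candidate queries ---

def joint_loss(context: str, target: str, corpus: list[str], k: int = 3) -> float:
    terms = []
    for query, query_logp in generate_candidate_queries(context, k):
        knowledge = select_knowledge(query, corpus)
        terms.append(query_logp + response_log_likelihood(context, knowledge, target))
    # Negative log of the marginal likelihood over candidate queries (log-sum-exp).
    m = max(terms)
    return -(m + math.log(sum(math.exp(t - m) for t in terms)))

if __name__ == "__main__":
    corpus = [
        "Toronto is the capital city of the province of Ontario.",
        "The CN Tower is a landmark in Toronto, Canada.",
    ]
    context = "User: Where was ACL 2023 held?\nUser: Tell me something about Toronto."
    target = "The CN Tower is a famous landmark in Toronto."
    print(f"joint loss: {joint_loss(context, target, corpus):.3f}")
```

In this reading, only the candidate queries and the target response drive the loss, which is consistent with the abstract's claim that no query annotations or knowledge provenance are needed.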
Anthology ID:
2023.acl-long.97
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1730–1745
URL:
https://aclanthology.org/2023.acl-long.97
DOI:
10.18653/v1/2023.acl-long.97
Cite (ACL):
Mingzhu Cai, Siqi Bao, Xin Tian, Huang He, Fan Wang, and Hua Wu. 2023. Query Enhanced Knowledge-Intensive Conversation via Unsupervised Joint Modeling. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1730–1745, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Query Enhanced Knowledge-Intensive Conversation via Unsupervised Joint Modeling (Cai et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.97.pdf
Video:
https://aclanthology.org/2023.acl-long.97.mp4