KAUCUS - Knowledgeable User Simulators for Training Large Language Models

Kaustubh Dhole


Abstract
An effective multi-turn instruction-following assistant can be developed by creating a simulator that generates useful interaction data. Rather than relying solely on its intrinsic weights, an ideal user simulator should also be able to rapidly bootstrap external knowledge in its raw form, so as to simulate the multifarious diversity of text available on the internet. Previous user simulators generally lacked diversity, were mostly closed-domain, and required rigid schemas, making them inefficient to scale rapidly with external knowledge. In this regard, we introduce Kaucus, a Knowledge-Augmented User Simulator framework, which outlines a process for creating diverse user simulators that can seamlessly exploit external knowledge and benefit downstream assistant model training. Through two GPT-J-based simulators, viz. a Retrieval Augmented Simulator and a Summary Controlled Simulator, we generate diverse simulator-assistant interactions. Through reward- and preference-model-based evaluations, we find that these interactions serve as useful training data and produce more helpful downstream assistants. We also find that incorporating knowledge, whether through retrieval augmentation or summary control, helps create better assistants.
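The core idea of the Retrieval Augmented Simulator can be sketched as a two-step loop: retrieve a relevant knowledge passage, then condition the simulator LM's next user utterance on it. The toy scoring function and prompt format below are purely illustrative assumptions, not the paper's actual implementation:

```python
from collections import Counter
import math

def retrieve(query, corpus, k=1):
    """Score passages against the query by token overlap weighted by
    inverse document frequency, and return the top-k passages."""
    def tokens(text):
        return text.lower().split()

    docs = [Counter(tokens(p)) for p in corpus]
    n = len(corpus)
    # Rarer tokens contribute more to the match score.
    df = Counter(t for d in docs for t in d)

    def score(doc):
        return sum(math.log(1 + n / df[t]) for t in tokens(query) if t in doc)

    ranked = sorted(corpus, key=lambda p: score(Counter(tokens(p))), reverse=True)
    return ranked[:k]

def build_simulator_prompt(passage, history):
    """Prefix the dialogue history with a retrieved passage so the
    user-simulator LM grounds its next utterance in external knowledge."""
    turns = "\n".join(f"{role}: {text}" for role, text in history)
    return f"Knowledge: {passage}\n{turns}\nUser:"

corpus = [
    "GPT-J is a 6-billion-parameter open-source language model.",
    "St. Julians is a town in Malta.",
]
passage = retrieve("tell me about the GPT-J model", corpus)[0]
prompt = build_simulator_prompt(passage, [("Assistant", "How can I help?")])
```

In practice the retriever would index raw web-scale text and the prompt would be fed to the simulator model (here, GPT-J) to sample the next user turn; the Summary Controlled variant would instead prefix a document summary as the control signal.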
Anthology ID:
2024.scichat-1.5
Volume:
Proceedings of the 1st Workshop on Simulating Conversational Intelligence in Chat (SCI-CHAT 2024)
Month:
March
Year:
2024
Address:
St. Julians, Malta
Editors:
Yvette Graham, Qun Liu, Gerasimos Lampouras, Ignacio Iacobacci, Sinead Madden, Haider Khalid, Rameez Qureshi
Venues:
SCI-CHAT | WS
Publisher:
Association for Computational Linguistics
Pages:
53–65
URL:
https://aclanthology.org/2024.scichat-1.5
Cite (ACL):
Kaustubh Dhole. 2024. KAUCUS - Knowledgeable User Simulators for Training Large Language Models. In Proceedings of the 1st Workshop on Simulating Conversational Intelligence in Chat (SCI-CHAT 2024), pages 53–65, St. Julians, Malta. Association for Computational Linguistics.
Cite (Informal):
KAUCUS - Knowledgeable User Simulators for Training Large Language Models (Dhole, SCI-CHAT-WS 2024)
PDF:
https://aclanthology.org/2024.scichat-1.5.pdf