Ontologically Faithful Generation of Non-Player Character Dialogues

Nathaniel Weir, Ryan Thomas, Randolph d’Amore, Kellie Hill, Benjamin Van Durme, Harsh Jhamtani


Abstract
We introduce a language generation dataset grounded in a popular video game. KNUDGE (**KN**owledge Constrained **U**ser-NPC **D**ialogue **GE**neration) requires models to produce trees of dialogue between video game characters that accurately reflect quest and entity specifications stated in natural language. KNUDGE is constructed from side quest dialogues drawn directly from game data of Obsidian Entertainment’s _The Outer Worlds_, leading to real-world complexities in generation: (1) utterances must remain faithful to the game lore, including character personas and backstories; (2) a dialogue must accurately reveal new quest details to the human player; and (3) dialogues are large trees as opposed to linear chains of utterances. We report results for a set of neural generation models using supervised and in-context learning techniques; we find competent performance but room for future work addressing the challenges of creating realistic, game-quality dialogues.
Anthology ID: 2024.emnlp-main.520
Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 9212–9242
URL: https://aclanthology.org/2024.emnlp-main.520
DOI: 10.18653/v1/2024.emnlp-main.520
Cite (ACL): Nathaniel Weir, Ryan Thomas, Randolph d’Amore, Kellie Hill, Benjamin Van Durme, and Harsh Jhamtani. 2024. Ontologically Faithful Generation of Non-Player Character Dialogues. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 9212–9242, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): Ontologically Faithful Generation of Non-Player Character Dialogues (Weir et al., EMNLP 2024)
PDF: https://aclanthology.org/2024.emnlp-main.520.pdf
Software: 2024.emnlp-main.520.software.zip