Knowledge Enhanced Fine-Tuning for Better Handling Unseen Entities in Dialogue Generation

Leyang Cui, Yu Wu, Shujie Liu, Yue Zhang


Abstract
Although pre-trained models have achieved great success in dialogue generation, their performance drops dramatically when the input contains an entity that does not appear in the pre-training and fine-tuning datasets (an unseen entity). To address this issue, existing methods leverage an external knowledge base to generate appropriate responses. In real-world practice, however, the entity may not be included in the knowledge base, or the retrieved knowledge may be imprecise. To deal with this problem, instead of introducing a knowledge base as input, we force the model to learn a better semantic representation by predicting the information in the knowledge base based only on the input context. Specifically, with the help of a knowledge base, we introduce two auxiliary training objectives: 1) Interpret Masked Word, which conjectures the meaning of the masked entity given the context; 2) Hypernym Generation, which predicts the hypernym of the entity based on the context. Experimental results on two dialogue corpora verify the effectiveness of our method under both knowledge-available and knowledge-unavailable settings.
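The abstract describes fine-tuning with two knowledge-based auxiliary objectives alongside the main generation loss. A minimal sketch of such a multi-task objective is below; the function name, the simple weighted-sum combination, and the weights are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a multi-task fine-tuning objective: the
# response-generation loss is combined with two auxiliary losses
# (Interpret Masked Word and Hypernym Generation), which are used
# only during training -- at inference time no knowledge base is needed.

def combined_loss(gen_loss: float,
                  masked_word_loss: float,
                  hypernym_loss: float,
                  alpha: float = 1.0,
                  beta: float = 1.0) -> float:
    """Weighted sum of the main generation loss and the two
    knowledge-based auxiliary losses (weights are assumed)."""
    return gen_loss + alpha * masked_word_loss + beta * hypernym_loss

# Example: equal weighting of all three objectives.
print(combined_loss(2.0, 0.5, 0.3))
```

Because the auxiliary losses are dropped at inference time, the model incurs no retrieval cost when responding; the knowledge is distilled into its parameters during fine-tuning.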
Anthology ID:
2021.emnlp-main.179
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2328–2337
URL:
https://aclanthology.org/2021.emnlp-main.179
DOI:
10.18653/v1/2021.emnlp-main.179
Cite (ACL):
Leyang Cui, Yu Wu, Shujie Liu, and Yue Zhang. 2021. Knowledge Enhanced Fine-Tuning for Better Handling Unseen Entities in Dialogue Generation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2328–2337, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Knowledge Enhanced Fine-Tuning for Better Handling Unseen Entities in Dialogue Generation (Cui et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.179.pdf
Video:
https://aclanthology.org/2021.emnlp-main.179.mp4
Code
nealcly/ke-blender
Data
Wizard of Wikipedia