TegTok: Augmenting Text Generation via Task-specific and Open-world Knowledge

Chao-Hong Tan, Jia-Chen Gu, Chongyang Tao, Zhen-Hua Ling, Can Xu, Huang Hu, Xiubo Geng, Daxin Jiang


Abstract
Generating natural and informative text has been a long-standing problem in NLP. Much effort has been dedicated to incorporating pre-trained language models (PLMs) with various forms of open-world knowledge, such as knowledge graphs or wiki pages. However, their ability to access and manipulate task-specific knowledge on downstream tasks is still limited, as this type of knowledge is usually not well covered in PLMs and is hard to acquire. To address this problem, we propose augmenting TExt Generation via Task-specific and Open-world Knowledge (TegTok) in a unified framework. Our model selects knowledge entries from two types of knowledge sources through dense retrieval and then injects them into the input encoding and output decoding stages, respectively, on the basis of PLMs. With the help of these two types of knowledge, our model can learn what to generate and how to generate it. Experiments on two text generation tasks, dialogue generation and question generation, each on its own dataset, show that our method achieves better performance than various baseline models.
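The retrieve-then-inject pipeline summarized above can be illustrated with a minimal sketch. This is not the authors' released implementation (see the repository linked under Code below); the embeddings, entry texts, and the separator tokens are illustrative assumptions, and the sketch only shows dense retrieval by inner-product scoring over two knowledge sources followed by concatenating the selected entries to the generator input.

```python
# Illustrative sketch of dense retrieval over two knowledge sources, as described
# in the abstract. NOT the authors' implementation (see lxchtan/tegtok).
# Random vectors stand in for the dense query/knowledge encoders used in the paper.
import numpy as np


def retrieve_top_k(query_emb, knowledge_embs, knowledge_texts, k=2):
    """Score each knowledge entry by inner product with the query; return the top-k texts."""
    scores = knowledge_embs @ query_emb            # shape: (num_entries,)
    top_idx = np.argsort(-scores)[:k]              # indices of the k highest-scoring entries
    return [knowledge_texts[i] for i in top_idx]


rng = np.random.default_rng(0)
dim = 128
query_emb = rng.normal(size=dim)                   # stand-in for the encoded dialogue context / question

# Two knowledge sources: task-specific entries and open-world (e.g., wiki) entries.
task_specific_texts = ["task entry A", "task entry B", "task entry C"]
open_world_texts = ["wiki entry X", "wiki entry Y"]
task_specific_embs = rng.normal(size=(len(task_specific_texts), dim))
open_world_embs = rng.normal(size=(len(open_world_texts), dim))

# Retrieve from each source separately; one set can then be injected at input
# encoding and the other at output decoding, per the abstract's description.
selected_task = retrieve_top_k(query_emb, task_specific_embs, task_specific_texts, k=2)
selected_open = retrieve_top_k(query_emb, open_world_embs, open_world_texts, k=1)

# Hypothetical injection at the input side: prepend retrieved entries to the context.
generator_input = " [KNOWLEDGE] ".join(selected_task) + " [SEP] " + "dialogue context here"
print(selected_task, selected_open)
print(generator_input)
```

In the paper's setting the random vectors would be replaced by PLM-based dense encoders, and the selected open-world entries would additionally condition the decoding stage rather than only the encoder input.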
Anthology ID:
2022.findings-acl.125
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1597–1609
URL:
https://aclanthology.org/2022.findings-acl.125
DOI:
10.18653/v1/2022.findings-acl.125
Cite (ACL):
Chao-Hong Tan, Jia-Chen Gu, Chongyang Tao, Zhen-Hua Ling, Can Xu, Huang Hu, Xiubo Geng, and Daxin Jiang. 2022. TegTok: Augmenting Text Generation via Task-specific and Open-world Knowledge. In Findings of the Association for Computational Linguistics: ACL 2022, pages 1597–1609, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
TegTok: Augmenting Text Generation via Task-specific and Open-world Knowledge (Tan et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.125.pdf
Code:
lxchtan/tegtok