Injecting Entity Types into Entity-Guided Text Generation

Xiangyu Dong, Wenhao Yu, Chenguang Zhu, Meng Jiang


Abstract
Recent successes in deep generative modeling have led to significant advances in natural language generation (NLG). Incorporating entities into neural generation models has yielded substantial improvements by helping the model infer the summary topic and generate coherent content. To enhance the role of entities in NLG, in this paper we aim to model entity types in the decoding phase so as to generate contextual words accurately. We develop a novel NLG model that produces a target sequence based on a given list of entities. Our model uses a multi-step decoder that injects entity types into the process of entity mention generation. Experiments on two public news datasets demonstrate that type injection performs better than existing type-embedding concatenation baselines.
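To make the contrast concrete, below is a minimal PyTorch sketch of a single decoding step that first predicts an entity type and then conditions word generation on that type, as opposed to merely concatenating a fixed type embedding to the decoder input. The module names, dimensions, and two-step scheme here are illustrative assumptions rather than the authors' architecture; the official implementation is in the wyu97/InjType repository linked below.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TypeInjectedDecoderStep(nn.Module):
    # Hypothetical single decoding step: predict an entity type, then inject
    # its embedding when producing the word distribution.
    def __init__(self, vocab_size, num_types, d_emb=256, d_hid=512):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_emb)
        self.type_emb = nn.Embedding(num_types, d_emb)
        self.cell = nn.GRUCell(d_emb, d_hid)
        self.type_clf = nn.Linear(d_hid, num_types)      # step 1: entity-type prediction
        self.out = nn.Linear(d_hid + d_emb, vocab_size)  # step 2: type-conditioned word logits

    def forward(self, prev_word, hidden):
        h = self.cell(self.word_emb(prev_word), hidden)
        type_logits = self.type_clf(h)                   # which type (if any) to realize next
        type_id = type_logits.argmax(dim=-1)
        injected = torch.cat([h, self.type_emb(type_id)], dim=-1)
        word_logits = self.out(injected)                 # word distribution conditioned on the type
        return F.log_softmax(word_logits, dim=-1), type_logits, h

# Toy usage: one decoding step over a batch of 2.
step = TypeInjectedDecoderStep(vocab_size=1000, num_types=10)
log_probs, type_logits, h1 = step(torch.tensor([5, 42]), torch.zeros(2, 512))
print(log_probs.shape)  # torch.Size([2, 1000])

A concatenation baseline would instead append a static type embedding to the decoder input at every step; the sketch above shows how the type choice can be made part of the decoding decision itself.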
Anthology ID:
2021.emnlp-main.56
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
734–741
URL:
https://aclanthology.org/2021.emnlp-main.56
DOI:
10.18653/v1/2021.emnlp-main.56
Cite (ACL):
Xiangyu Dong, Wenhao Yu, Chenguang Zhu, and Meng Jiang. 2021. Injecting Entity Types into Entity-Guided Text Generation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 734–741, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Injecting Entity Types into Entity-Guided Text Generation (Dong et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.56.pdf
Video:
https://aclanthology.org/2021.emnlp-main.56.mp4
Code
wyu97/InjType + additional community code
Data
New York Times Annotated Corpus