Syntax Controlled Knowledge Graph-to-Text Generation with Order and Semantic Consistency

Jin Liu, Chongfeng Fan, Zhou Fengyu, Huijuan Xu


Abstract
The knowledge graph (KG) stores a large amount of structured knowledge, but it is not easy for humans to read directly. Knowledge graph-to-text (KG-to-text) generation aims to produce easy-to-understand sentences from a KG while maintaining semantic consistency between the generated sentences and the KG. Existing KG-to-text generation methods cast this task as sequence-to-sequence generation with a linearized KG as input, and handle the consistency between the generated text and the KG only through a simple choice between a decoded sentence word and a KG node word at each time step. However, the linearization order of the KG is obtained through heuristic search without data-driven optimization. In this paper, we optimize the prediction of the knowledge description order under order supervision extracted from the caption, and further enhance the consistency between the generated sentences and the KG through syntactic and semantic regularization. We incorporate Part-of-Speech (POS) syntactic tags to constrain the positions at which words may be copied from the KG, and employ a semantic context scoring function to evaluate the semantic fitness of each word within its local context as each word of the sentence is decoded. Extensive experiments on two datasets, WebNLG and DART, show that our method achieves state-of-the-art performance. Our code is publicly available.
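To make two of the ideas in the abstract concrete, the sketch below illustrates (1) linearizing a set of KG triples into a token sequence and (2) a POS-gated copy mechanism in which copying from the KG is only permitted when the next word's predicted POS tag is a content tag. This is a minimal illustration under assumed names (linearize_kg, copy_or_generate, the <H>/<R>/<T> markers), not the authors' implementation in lemonqc/kg2text.

```python
# Minimal sketch, not the paper's code: KG linearization and a POS-gated
# copy/generate mixture. All function and marker names are hypothetical.
import torch
import torch.nn.functional as F

def linearize_kg(triples):
    """Flatten (head, relation, tail) triples into one token sequence."""
    tokens = []
    for h, r, t in triples:
        tokens += ["<H>"] + h.split() + ["<R>"] + r.split() + ["<T>"] + t.split()
    return tokens

def copy_or_generate(gen_logits, copy_logits, p_copy,
                     pos_tag, copy_pos_tags=("NOUN", "PROPN", "NUM")):
    """Mix a vocabulary (generation) distribution with a copy distribution
    over linearized KG tokens; copying is blocked unless the predicted POS
    tag of the next word is a content tag."""
    gen_dist = F.softmax(gen_logits, dim=-1)    # over the output vocabulary
    copy_dist = F.softmax(copy_logits, dim=-1)  # over linearized KG tokens
    if pos_tag not in copy_pos_tags:            # syntactic gate on copying
        p_copy = 0.0
    return (1.0 - p_copy) * gen_dist, p_copy * copy_dist

if __name__ == "__main__":
    kg_tokens = linearize_kg([("Alan Bean", "occupation", "test pilot")])
    print(kg_tokens)
    vocab_dist, copy_dist = copy_or_generate(
        torch.randn(10), torch.randn(len(kg_tokens)),
        p_copy=0.6, pos_tag="PROPN")
    # The two parts together still sum to ~1.0, i.e. a valid mixture.
    print(float(vocab_dist.sum() + copy_dist.sum()))
```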
Anthology ID:
2022.findings-naacl.95
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1278–1291
URL:
https://aclanthology.org/2022.findings-naacl.95
DOI:
10.18653/v1/2022.findings-naacl.95
Cite (ACL):
Jin Liu, Chongfeng Fan, Zhou Fengyu, and Huijuan Xu. 2022. Syntax Controlled Knowledge Graph-to-Text Generation with Order and Semantic Consistency. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1278–1291, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Syntax Controlled Knowledge Graph-to-Text Generation with Order and Semantic Consistency (Liu et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.95.pdf
Video:
https://aclanthology.org/2022.findings-naacl.95.mp4
Code:
lemonqc/kg2text
Data:
DART