𝒫2: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation

Qipeng Guo, Zhijing Jin, Ning Dai, Xipeng Qiu, Xiangyang Xue, David Wipf, Zheng Zhang


Abstract
Text verbalization of knowledge graphs is an important problem with wide application to natural language generation (NLG) systems. It is challenging because the generated text must not only be grammatically correct (fluency) but also faithfully convey the given structured knowledge (relevance), among other criteria. We develop a plan-and-pretrain approach, 𝒫2, which combines a relational graph convolutional network (R-GCN) planner with the pretrained sequence-to-sequence (Seq2Seq) model T5. Specifically, the R-GCN planner first predicts an ordering of the knowledge graph triples, corresponding to the order in which they will be mentioned in the text, and T5 then produces the surface realization of the given plan. In the WebNLG+ 2020 Challenge, our submission ranked 1st on all automatic and human evaluation criteria of the English RDF-to-text generation task.
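The two-stage pipeline described above can be illustrated in code. The following is a minimal sketch, not the authors' implementation: the R-GCN planner is stubbed out as a pass-through ordering, the realizer is the off-the-shelf t5-base checkpoint from Hugging Face transformers (the paper fine-tunes T5 on WebNLG, so the raw checkpoint will not produce polished output), and the triple linearization format and task prefix are illustrative assumptions.

```python
# Sketch of the plan-and-pretrain pipeline: plan (order triples), then realize (T5).
from transformers import T5ForConditionalGeneration, T5Tokenizer


def plan(triples):
    """Stage 1 (placeholder): order the input triples.

    The paper trains an R-GCN over the knowledge graph to predict the order
    in which triples are mentioned in the text; here we simply keep the
    given order so the sketch stays self-contained.
    """
    return list(triples)


def realize(ordered_triples, model, tokenizer):
    """Stage 2: linearize the planned triples and let T5 generate text."""
    # One possible linearization: "subject | predicate | object", joined in plan order.
    source = " && ".join(f"{s} | {p} | {o}" for s, p, o in ordered_triples)
    # The task prefix below is an assumption, not the paper's exact input format.
    inputs = tokenizer("translate Graph to English: " + source, return_tensors="pt")
    output_ids = model.generate(**inputs, max_length=128, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    tokenizer = T5Tokenizer.from_pretrained("t5-base")
    model = T5ForConditionalGeneration.from_pretrained("t5-base")
    triples = [
        ("Alan_Bean", "occupation", "Test_pilot"),
        ("Alan_Bean", "mission", "Apollo_12"),
    ]
    print(realize(plan(triples), model, tokenizer))
```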
Anthology ID:
2020.webnlg-1.10
Volume:
Proceedings of the 3rd International Workshop on Natural Language Generation from the Semantic Web (WebNLG+)
Month:
December
Year:
2020
Address:
Dublin, Ireland (Virtual)
Editors:
Thiago Castro Ferreira, Claire Gardent, Nikolai Ilinykh, Chris van der Lee, Simon Mille, Diego Moussallem, Anastasia Shimorina
Venue:
WebNLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
100–106
URL:
https://aclanthology.org/2020.webnlg-1.10
Cite (ACL):
Qipeng Guo, Zhijing Jin, Ning Dai, Xipeng Qiu, Xiangyang Xue, David Wipf, and Zheng Zhang. 2020. 𝒫2: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation. In Proceedings of the 3rd International Workshop on Natural Language Generation from the Semantic Web (WebNLG+), pages 100–106, Dublin, Ireland (Virtual). Association for Computational Linguistics.
Cite (Informal):
𝒫2: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation (Guo et al., WebNLG 2020)
PDF:
https://aclanthology.org/2020.webnlg-1.10.pdf
Data
WebNLG