Using Large Language Models for Zero-Shot Natural Language Generation from Knowledge Graphs

Agnes Axelsson, Gabriel Skantze
Abstract
In any system that uses structured knowledge graph (KG) data as its underlying knowledge representation, KG-to-text generation is a useful tool for turning parts of the graph into text that humans can understand. Recent work has shown that models pretrained on large amounts of text data can perform well on the KG-to-text task, even with relatively little training data for graph-to-text specifically. In this paper, we build on this idea by using large language models to perform zero-shot generation, relying on nothing but the model's ability to interpret the triple structure it reads. We show that ChatGPT achieves near state-of-the-art performance on some measures of the WebNLG 2020 challenge, but falls behind on others. Additionally, we compare factual, counter-factual and fictional statements, and show that there is a significant connection between what the LLM already knows about the data it is parsing and the quality of the output text.
Anthology ID:
2023.mmnlg-1.5
Volume:
Proceedings of the Workshop on Multimodal, Multilingual Natural Language Generation and Multilingual WebNLG Challenge (MM-NLG 2023)
Month:
September
Year:
2023
Address:
Prague, Czech Republic
Editors:
Albert Gatt, Claire Gardent, Liam Cripwell, Anya Belz, Claudia Borg, Aykut Erdem, Erkut Erdem
Venues:
MMNLG | WS
Publisher:
Association for Computational Linguistics
Pages:
39–54
URL:
https://aclanthology.org/2023.mmnlg-1.5
Cite (ACL):
Agnes Axelsson and Gabriel Skantze. 2023. Using Large Language Models for Zero-Shot Natural Language Generation from Knowledge Graphs. In Proceedings of the Workshop on Multimodal, Multilingual Natural Language Generation and Multilingual WebNLG Challenge (MM-NLG 2023), pages 39–54, Prague, Czech Republic. Association for Computational Linguistics.
Cite (Informal):
Using Large Language Models for Zero-Shot Natural Language Generation from Knowledge Graphs (Axelsson & Skantze, MMNLG-WS 2023)
PDF:
https://aclanthology.org/2023.mmnlg-1.5.pdf