Ontology-guided Knowledge Graph Construction from Maintenance Short Texts

Zeno Cauter, Nikolay Yakovets


Abstract
Large-scale knowledge graph construction remains infeasible because it requires significant human-expert involvement. Further complications arise when building graphs from domain-specific data due to their unique vocabularies and associated contexts. In this work, we demonstrate the ability of open-source large language models (LLMs), such as Llama-2 and Llama-3, to extract facts from domain-specific Maintenance Short Texts (MSTs). We employ an approach that combines ontology-guided triplet extraction with in-context learning. Using only 20 semantically similar examples with the Llama-3-70B-Instruct model, we achieve performance comparable to that of previous methods relying on fine-tuned models such as SpERT and REBEL. This indicates that domain-specific fact extraction can be accomplished through inference alone, requiring minimal labeled data, and opens up possibilities for effective and efficient semi-automated knowledge graph construction from domain-specific data.
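The approach outlined in the abstract (retrieve semantically similar labeled examples, build an in-context prompt, and constrain extracted triplets to an ontology) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the toy ontology, the bag-of-words similarity (a stand-in for a real sentence embedder), and all function names are assumptions.

```python
# Hypothetical sketch of ontology-guided triplet extraction with in-context
# learning over Maintenance Short Texts (MSTs). Everything here (ontology,
# retrieval scheme, prompt layout) is illustrative, not the paper's code.
from collections import Counter
import math

# A minimal maintenance ontology: the relation types a triplet may use.
ONTOLOGY_RELATIONS = {"has_fault", "part_of", "requires_action"}

def bow(text):
    """Bag-of-words vector (a crude stand-in for a sentence embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_k_examples(query, labeled_pool, k=20):
    """Pick the k labeled MSTs most similar to the query text."""
    qv = bow(query)
    ranked = sorted(labeled_pool,
                    key=lambda ex: cosine(qv, bow(ex["text"])),
                    reverse=True)
    return ranked[:k]

def build_prompt(query, examples):
    """Assemble an in-context prompt: ontology relations, then examples."""
    lines = ["Extract (head, relation, tail) triplets.",
             "Allowed relations: " + ", ".join(sorted(ONTOLOGY_RELATIONS)),
             ""]
    for ex in examples:
        lines.append(f"Text: {ex['text']}")
        lines.append(f"Triplets: {ex['triplets']}")
    lines.append(f"Text: {query}")
    lines.append("Triplets:")
    return "\n".join(lines)

def validate(triplets):
    """Discard model outputs whose relation is not licensed by the ontology."""
    return [t for t in triplets if t[1] in ONTOLOGY_RELATIONS]
```

In use, `build_prompt` would be sent to the LLM (e.g. Llama-3-70B-Instruct) and the parsed response passed through `validate`, so that only ontology-conformant triplets enter the knowledge graph.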
Anthology ID:
2024.kallm-1.8
Volume:
Proceedings of the 1st Workshop on Knowledge Graphs and Large Language Models (KaLLM 2024)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Russa Biswas, Lucie-Aimée Kaffee, Oshin Agarwal, Pasquale Minervini, Sameer Singh, Gerard de Melo
Venues:
KaLLM | WS
Publisher:
Association for Computational Linguistics
Pages:
75–84
URL:
https://aclanthology.org/2024.kallm-1.8
DOI:
10.18653/v1/2024.kallm-1.8
Cite (ACL):
Zeno Cauter and Nikolay Yakovets. 2024. Ontology-guided Knowledge Graph Construction from Maintenance Short Texts. In Proceedings of the 1st Workshop on Knowledge Graphs and Large Language Models (KaLLM 2024), pages 75–84, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Ontology-guided Knowledge Graph Construction from Maintenance Short Texts (Cauter & Yakovets, KaLLM-WS 2024)
PDF:
https://aclanthology.org/2024.kallm-1.8.pdf