SKILL: Structured Knowledge Infusion for Large Language Models

Fedor Moiseev, Zhe Dong, Enrique Alfonseca, Martin Jaggi


Abstract
Large language models (LLMs) have demonstrated human-level performance on a vast spectrum of natural language tasks. However, it is largely unexplored whether they can better internalize knowledge from structured data, such as a knowledge graph, or from text. In this work, we propose a method to infuse structured knowledge into LLMs by directly training T5 models on the factual triples of knowledge graphs (KGs). We show that models pre-trained on the Wikidata KG with our method outperform the T5 baselines on FreebaseQA and WikiHop, as well as the Wikidata-answerable subsets of TriviaQA and NaturalQuestions. Models pre-trained on factual triples compare competitively with those pre-trained on natural language sentences that contain the same knowledge. When trained on a smaller KG, WikiMovies, we observed a 3x improvement in exact-match score on the MetaQA task. The proposed method has the advantage that no alignment between the knowledge graph and a text corpus is required when curating training data. This makes our method particularly useful when working with industry-scale knowledge graphs.
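The abstract describes training T5 directly on KG triples. A minimal, hypothetical sketch of how a (subject, relation, object) triple could be turned into a T5-style seq2seq training pair is shown below; the `triple_to_example` function, the choice of masking the object, and the sentinel format are illustrative assumptions, not the authors' exact recipe.

```python
def triple_to_example(subject: str, relation: str, obj: str) -> tuple[str, str]:
    """Mask the object of a factual triple so the model must predict it,
    mirroring T5's span-corruption format with a single sentinel token.
    This is an assumed formatting, for illustration only."""
    source = f"{subject} {relation} <extra_id_0>"
    target = f"<extra_id_0> {obj}"
    return source, target

# Example triples in the style of a Wikidata-like KG (made up for illustration).
triples = [
    ("Seattle", "located in", "United States"),
    ("T5", "developed by", "Google"),
]
examples = [triple_to_example(*t) for t in triples]
for source, target in examples:
    print(source, "->", target)
```

Because each training pair is derived from a triple alone, no alignment between the KG and a text corpus is needed, which is the data-curation advantage the abstract highlights.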
Anthology ID:
2022.naacl-main.113
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1581–1588
URL:
https://aclanthology.org/2022.naacl-main.113
DOI:
10.18653/v1/2022.naacl-main.113
Cite (ACL):
Fedor Moiseev, Zhe Dong, Enrique Alfonseca, and Martin Jaggi. 2022. SKILL: Structured Knowledge Infusion for Large Language Models. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 1581–1588, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
SKILL: Structured Knowledge Infusion for Large Language Models (Moiseev et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.113.pdf
Video:
https://aclanthology.org/2022.naacl-main.113.mp4
Data:
C4, KELM, MetaQA, Natural Questions, TriviaQA, WikiHop, WikiMovies