Sequence-to-Sequence Knowledge Graph Completion and Question Answering

Apoorv Saxena, Adrian Kochsiek, Rainer Gemulla


Abstract
Knowledge graph embedding (KGE) models represent each entity and relation of a knowledge graph (KG) with low-dimensional embedding vectors. These methods have recently been applied to KG link prediction and question answering over incomplete KGs (KGQA). KGEs typically create an embedding for each entity in the graph, which results in large model sizes on real-world graphs with millions of entities. For downstream tasks, these atomic entity representations often need to be integrated into a multi-stage pipeline, limiting their utility. We show that an off-the-shelf encoder-decoder Transformer model can serve as a scalable and versatile KGE model, obtaining state-of-the-art results for KG link prediction and question answering over incomplete KGs. We achieve this by posing KG link prediction as a sequence-to-sequence task and exchanging the triple scoring approach taken by prior KGE methods for autoregressive decoding. This simple but powerful method reduces model size by up to 98% compared to conventional KGE models while keeping inference time tractable. After finetuning this model on the task of KGQA over incomplete KGs, our approach outperforms baselines on multiple large-scale datasets without extensive hyperparameter tuning.
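
The core idea in the abstract, verbalizing a (head entity, relation) query as text and autoregressively decoding the tail entity, can be sketched with an off-the-shelf encoder-decoder Transformer. A minimal sketch follows; the checkpoint name ("t5-small"), the "predict tail: ..." prompt format, and the decoding settings are illustrative assumptions, not the authors' released configuration, which lives in the linked apoorvumang/kgt5 repository.

# Minimal sketch of seq2seq KG link prediction with an off-the-shelf T5 model.
# Assumptions: "t5-small" stands in for a model finetuned on verbalized KG
# triples, and the "predict tail: ..." prompt format is hypothetical; see
# apoorvumang/kgt5 for the authors' actual setup.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Verbalize the (head, relation) query as plain text. Instead of scoring
# every candidate triple (as conventional KGE models do), the decoder
# generates the tail entity's surface form token by token.
query = "predict tail: barack obama | country of citizenship"
inputs = tokenizer(query, return_tensors="pt")

# Beam search returns several candidate tail entities, which can then be
# ranked, e.g. by sequence log-probability.
outputs = model.generate(
    **inputs,
    num_beams=5,
    num_return_sequences=5,
    max_new_tokens=32,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))

One practical detail this glosses over: decoded strings must be mapped back to KG entities, for example by matching against known entity mentions, before standard link-prediction metrics can be computed.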
Anthology ID:
2022.acl-long.201
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2814–2828
URL:
https://aclanthology.org/2022.acl-long.201
DOI:
10.18653/v1/2022.acl-long.201
Cite (ACL):
Apoorv Saxena, Adrian Kochsiek, and Rainer Gemulla. 2022. Sequence-to-Sequence Knowledge Graph Completion and Question Answering. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2814–2828, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Sequence-to-Sequence Knowledge Graph Completion and Question Answering (Saxena et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.201.pdf
Software:
2022.acl-long.201.software.zip
Video:
https://aclanthology.org/2022.acl-long.201.mp4
Code:
apoorvumang/kgt5
Data:
ComplexWebQuestions, MetaQA, OGB-LSC, WebQuestions, WebQuestionsSP, WikiMovies, Wikidata5M