MEKER: Memory Efficient Knowledge Embedding Representation for Link Prediction and Question Answering

Viktoriia Chekalina, Anton Razzhigaev, Albert Sayapin, Evgeny Frolov, Alexander Panchenko


Abstract
Knowledge Graphs (KGs) are symbolically structured stores of facts. KG embeddings encode this information compactly and are used in NLP tasks that require implicit knowledge about the real world. However, the KGs that are useful in practical NLP applications are enormous, and training embeddings over them raises memory cost issues. We represent a KG as a 3rd-order binary tensor and move beyond the standard CP decomposition (CITATION) by using a data-specific generalized version of it (CITATION). This generalization of the standard CP-ALS algorithm makes it possible to obtain optimization gradients without a backpropagation mechanism, which reduces the memory needed during training while providing computational benefits. We propose MEKER, a memory-efficient KG embedding model that yields SOTA-comparable performance on link prediction and KG-based question answering.
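The standard CP decomposition that the paper builds on can be sketched with classical alternating least squares (ALS) updates. The following is a toy NumPy illustration only, not the paper's generalized, memory-efficient variant; all shapes, ranks, and iteration counts are illustrative assumptions.

```python
# Minimal sketch of classical CP-ALS for a 3rd-order tensor T ≈ Σ_r a_r ∘ b_r ∘ c_r.
# This is NOT the MEKER algorithm, only the baseline it generalizes.
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest (C order)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(B, C):
    """Column-wise Khatri-Rao product of (J, R) and (K, R) -> (J*K, R)."""
    J, R = B.shape
    K = C.shape[0]
    return np.einsum('jr,kr->jkr', B, C).reshape(J * K, R)

def cp_als(T, rank, n_iter=100, seed=0):
    """Fit factor matrices A, B, C by alternating least squares."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((dim, rank)) for dim in T.shape)
    for _ in range(n_iter):
        # Each update solves a linear least-squares problem for one factor,
        # holding the other two fixed.
        A = unfold(T, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(T, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(T, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

def reconstruct(A, B, C):
    """Rebuild the full tensor from the CP factors."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)
```

In the KG setting, the three tensor modes index head entities, relations, and tail entities, and a nonzero entry marks a known fact; the rows of the factor matrices then serve as entity and relation embeddings.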
Anthology ID:
2022.acl-srw.27
Original:
2022.acl-srw.27v1
Version 2:
2022.acl-srw.27v2
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Samuel Louvan, Andrea Madotto, Brielen Madureira
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
355–365
URL:
https://aclanthology.org/2022.acl-srw.27
DOI:
10.18653/v1/2022.acl-srw.27
Cite (ACL):
Viktoriia Chekalina, Anton Razzhigaev, Albert Sayapin, Evgeny Frolov, and Alexander Panchenko. 2022. MEKER: Memory Efficient Knowledge Embedding Representation for Link Prediction and Question Answering. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 355–365, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
MEKER: Memory Efficient Knowledge Embedding Representation for Link Prediction and Question Answering (Chekalina et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-srw.27.pdf
Data
FB15k-237, SimpleQuestions, Wikidata5M