On Masked Language Models for Contextual Link Prediction

Angus Brayne, Maciej Wiatrak, Dane Corneil


Abstract
In the real world, many relational facts require context; for instance, a politician holds a given elected position only for a particular timespan. This context (the timespan) is typically ignored in knowledge graph link prediction tasks, or is leveraged by models designed specifically to make use of it (i.e., n-ary link prediction models). Here, we show that the task of n-ary link prediction is easily performed using language models, applied with a basic method for constructing cloze-style query sentences. We introduce a pre-training methodology based on an auxiliary entity-linked corpus that outperforms other popular pre-trained models like BERT, even with a smaller model. This methodology also enables n-ary link prediction without access to any n-ary training set, which can be invaluable in circumstances where expensive and time-consuming curation of n-ary knowledge graphs is not feasible. We achieve state-of-the-art performance on the primary n-ary link prediction dataset WD50K, and on WikiPeople facts that include literals, which are typically ignored by knowledge graph embedding methods.
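As a rough illustration of the cloze-style querying the abstract describes, the sketch below builds a masked query sentence from a contextualized (n-ary) fact and scores completions with an off-the-shelf masked LM. The sentence template, the example fact, and the use of bert-base-uncased via the Hugging Face fill-mask pipeline are all illustrative assumptions, not the paper's actual query construction or pre-trained model.

from transformers import pipeline

# Off-the-shelf masked LM; the paper's own pre-trained model is not assumed here.
fill = pipeline("fill-mask", model="bert-base-uncased")

# A hypothetical n-ary fact: the qualifier is the "context" (timespan) that
# standard triple-based link prediction would discard.
fact = {
    "subject": "Barack Obama",
    "relation": "held the position of",
    "qualifier": "from 2009 to 2017",
}

# Construct a cloze-style query sentence with the target entity masked out.
mask = fill.tokenizer.mask_token
query = f"{fact['subject']} {fact['relation']} {mask} {fact['qualifier']}."

# Rank candidate completions by the model's token probabilities.
for pred in fill(query, top_k=5):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")

Because the qualifier is part of the query sentence, the model can condition its prediction on the context for free, which is the intuition behind applying masked LMs to n-ary link prediction without an n-ary training set.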
Anthology ID:
2022.deelio-1.9
Volume:
Proceedings of Deep Learning Inside Out (DeeLIO 2022): The 3rd Workshop on Knowledge Extraction and Integration for Deep Learning Architectures
Month:
May
Year:
2022
Address:
Dublin, Ireland and Online
Editors:
Eneko Agirre, Marianna Apidianaki, Ivan Vulić
Venue:
DeeLIO
Publisher:
Association for Computational Linguistics
Pages:
87–99
URL:
https://aclanthology.org/2022.deelio-1.9
DOI:
10.18653/v1/2022.deelio-1.9
Cite (ACL):
Angus Brayne, Maciej Wiatrak, and Dane Corneil. 2022. On Masked Language Models for Contextual Link Prediction. In Proceedings of Deep Learning Inside Out (DeeLIO 2022): The 3rd Workshop on Knowledge Extraction and Integration for Deep Learning Architectures, pages 87–99, Dublin, Ireland and Online. Association for Computational Linguistics.
Cite (Informal):
On Masked Language Models for Contextual Link Prediction (Brayne et al., DeeLIO 2022)
PDF:
https://aclanthology.org/2022.deelio-1.9.pdf
Video:
https://aclanthology.org/2022.deelio-1.9.mp4