Bilateral Masking with prompt for Knowledge Graph Completion

Yonghui Kong, Cunhang Fan, Yujie Chen, Shuai Zhang, Zhao Lv, Jianhua Tao


Abstract
The pre-trained language model (PLM) has achieved significant success in the field of knowledge graph completion (KGC) by effectively modeling entity and relation descriptions. Recent studies categorize work in this field into word-matching and sentence-matching methods, with the former lagging significantly behind. A critical issue with word-matching methods is that they fail to obtain satisfactory single embedding representations for entities. To address this issue and enhance entity representation, we propose the Bilateral Masking with prompt for Knowledge Graph Completion (BMKGC) approach. Our methodology employs prompts to narrow the distance between the predicted entity and the known entity. Additionally, the BMKGC model incorporates a bi-encoder architecture, enabling simultaneous predictions at both the head and the tail. Furthermore, we propose a straightforward technique to augment positive samples, mitigating the problem of degree bias present in knowledge graphs and thereby improving the model's robustness. Experimental results demonstrate that BMKGC achieves state-of-the-art performance on the WN18RR dataset.
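As a rough illustration of the bi-encoder idea sketched in the abstract — encoding a query with a masked entity slot and ranking candidate entity descriptions by similarity — here is a toy sketch. The embedding scheme (deterministic bag-of-hashed-token vectors) is a self-contained stand-in for the paper's PLM encoder, and all query/candidate strings are illustrative assumptions, not the paper's actual inputs:

```python
import numpy as np

DIM = 512  # embedding dimension for the toy encoder

def embed(text: str, dim: int = DIM) -> np.ndarray:
    """Toy encoder: sum a seeded random vector per token, then L2-normalize.

    This stands in for the PLM sentence encoder; it only captures lexical
    overlap, which is enough to illustrate the scoring pipeline.
    """
    vec = np.zeros(dim)
    for tok in text.lower().split():
        seed = int.from_bytes(tok.encode("utf-8"), "little") % (2**32)
        vec += np.random.default_rng(seed).standard_normal(dim)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def score(query: str, candidate: str) -> float:
    """Cosine similarity between the masked query and a candidate entity."""
    return float(embed(query) @ embed(candidate))

# Bilateral masking: one query masks the tail (shown), and a symmetric
# query of the form "[MASK] [SEP] relation [SEP] tail description"
# would mask the head, so both sides can be predicted.
tail_query = "cat domesticated carnivorous mammal [SEP] hypernym [SEP] [MASK]"
candidates = [
    "mammal warm blooded carnivorous vertebrate",
    "guitar stringed musical instrument",
]
ranked = sorted(candidates, key=lambda c: score(tail_query, c), reverse=True)
print(ranked[0])
```

Because the toy encoder is purely lexical, the candidate sharing tokens with the query ranks first; the paper's PLM-based encoders would instead capture semantic similarity between descriptions.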
Anthology ID:
2024.findings-naacl.17
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
240–249
URL:
https://aclanthology.org/2024.findings-naacl.17
DOI:
10.18653/v1/2024.findings-naacl.17
Cite (ACL):
Yonghui Kong, Cunhang Fan, Yujie Chen, Shuai Zhang, Zhao Lv, and Jianhua Tao. 2024. Bilateral Masking with prompt for Knowledge Graph Completion. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 240–249, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Bilateral Masking with prompt for Knowledge Graph Completion (Kong et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-naacl.17.pdf