Joint Pre-Encoding Representation and Structure Embedding for Efficient and Low-Resource Knowledge Graph Completion
Chenyu Qiu | Pengjiang Qian | Chuang Wang | Jian Yao | Li Liu | Fang Wei | Eddie Y.K. Eddie
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Knowledge graph completion (KGC) aims to infer missing or incomplete parts of a knowledge graph. Existing models are generally divided into structure-based and description-based models, among which description-based models often require longer training and inference times as well as increased memory usage. In this paper, we propose the Pre-Encoded Masked Language Model (PEMLM) to solve the KGC problem efficiently. By encoding textual descriptions into semantic representations before training, the required resources are significantly reduced. Furthermore, we introduce a straightforward but effective fusion framework to integrate structural embeddings with the pre-encoded semantic descriptions, which enhances the model's prediction performance on 1-N relations. Experimental results demonstrate that our proposed strategy attains state-of-the-art performance on the WN18RR (MRR +5.4% and Hits@1 +6.4%) and UMLS datasets. Compared to existing models, we increase inference speed by 30x and reduce training memory by approximately 60%.
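The core efficiency idea in the abstract — encode each entity's textual description once, before training, then fuse that frozen semantic vector with a learnable structure embedding — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the stand-in encoder, the dimensions, and the concatenation-style fusion are all assumptions for demonstration.

```python
import numpy as np

def pre_encode_descriptions(descriptions, dim=4, seed=0):
    """Hypothetical stand-in for a frozen text encoder.

    In PEMLM, descriptions would be encoded by a language model once,
    before KGC training, and the resulting vectors cached; here we just
    derive deterministic random vectors for illustration.
    """
    rng = np.random.default_rng(seed)
    return {entity: rng.standard_normal(dim) for entity in descriptions}

def fuse(semantic_vec, structure_vec):
    # One plausible instantiation of the fusion framework: concatenate
    # the cached semantic vector with the trainable structure embedding.
    return np.concatenate([semantic_vec, structure_vec])

# Toy entity descriptions (hypothetical data).
descriptions = {"dog": "a domesticated canine", "cat": "a small feline"}

semantic = pre_encode_descriptions(descriptions)        # computed once, reused
structure = {e: np.zeros(4) for e in descriptions}      # updated during training
joint = {e: fuse(semantic[e], structure[e]) for e in descriptions}

print(joint["dog"].shape)  # (8,)
```

Because the semantic vectors are fixed after the pre-encoding pass, the language model never runs inside the training loop, which is the source of the reported training-memory and inference-speed savings.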