MoCoKGC: Momentum Contrast Entity Encoding for Knowledge Graph Completion

Qingyang Li, Yanru Zhong, Yuchu Qin


Abstract
In recent years, numerous studies have sought to enhance the capabilities of pretrained language models (PLMs) for Knowledge Graph Completion (KGC) tasks by integrating structural information from knowledge graphs. However, existing approaches have not effectively combined the structural attributes of knowledge graphs with the textual descriptions of entities to generate robust entity encodings. To address this issue, this paper proposes MoCoKGC (Momentum Contrast Entity Encoding for Knowledge Graph Completion), which incorporates three primary encoders: the entity-relation encoder, the entity encoder, and the momentum entity encoder. Momentum contrastive learning not only provides more negative samples but also allows for the gradual updating of entity encodings. Consequently, we reintroduce the generated entity encodings into the encoder to incorporate the graph's structural information. Additionally, MoCoKGC enhances the inferential capabilities of the entity-relation encoder through deep prompts of relations. On the standard evaluation metric, Mean Reciprocal Rank (MRR), the MoCoKGC model demonstrates superior performance, achieving a 7.1% improvement on the WN18RR dataset and an 11% improvement on the Wikidata5M dataset, while also surpassing the current best model on the FB15k-237 dataset. Through a series of experiments, this paper thoroughly examines the role and contribution of each component and parameter of the model.
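The "gradual updating" the abstract refers to is the exponential-moving-average update characteristic of momentum contrast (MoCo). As a minimal sketch of that general technique (not the paper's exact implementation; parameter names and the momentum value are illustrative), the momentum encoder's parameters drift slowly toward the actively trained encoder's parameters:

```python
def momentum_update(theta_momentum, theta_query, m=0.999):
    """Exponential moving average (EMA) update used in momentum contrast.

    theta_momentum: parameters of the slowly updated momentum encoder
    theta_query:    parameters of the actively trained encoder
    m:              momentum coefficient; closer to 1 means slower drift

    Returns the updated momentum-encoder parameters:
        theta_k <- m * theta_k + (1 - m) * theta_q
    Because the update is gradual, encodings produced at different
    training steps remain consistent enough to reuse as negatives.
    """
    return [m * k + (1 - m) * q
            for k, q in zip(theta_momentum, theta_query)]


# Toy usage: one update step with a large momentum barely moves the encoder.
updated = momentum_update([1.0, -1.0], [0.0, 0.0], m=0.9)
# updated == [0.9, -0.9]
```

In standard MoCo this slow update is what keeps the queue of previously computed encodings usable as extra negative samples; here the same idea supplies both more negatives and stable entity encodings that can be fed back into the encoder.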
Anthology ID:
2024.emnlp-main.832
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
14940–14952
URL:
https://aclanthology.org/2024.emnlp-main.832
Cite (ACL):
Qingyang Li, Yanru Zhong, and Yuchu Qin. 2024. MoCoKGC: Momentum Contrast Entity Encoding for Knowledge Graph Completion. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 14940–14952, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
MoCoKGC: Momentum Contrast Entity Encoding for Knowledge Graph Completion (Li et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.832.pdf
Software:
 2024.emnlp-main.832.software.zip
Data:
 2024.emnlp-main.832.data.zip