Unifying Dual-Space Embedding for Entity Alignment via Contrastive Learning

Cunda Wang, Weihua Wang, Qiuyu Liang, Feilong Bao, Guanglai Gao


Abstract
Entity alignment (EA) aims to match identical entities across different knowledge graphs (KGs). Graph neural network-based entity alignment methods have achieved promising results in Euclidean space. However, KGs often contain complex local and hierarchical structures, which are hard to represent in a single space. In this paper, we propose a novel method named UniEA, which unifies dual-space embedding to preserve the intrinsic structure of KGs. Specifically, we simultaneously learn graph structure embeddings in both Euclidean and hyperbolic spaces to maximize the consistency between the embeddings in the two spaces. Moreover, we employ contrastive learning to mitigate the misalignment caused by similar neighboring entities, whose embeddings tend to become overly close. Extensive experiments on benchmark datasets demonstrate that our method achieves state-of-the-art performance in structure-based EA. Our code is available at https://github.com/wonderCS1213/UniEA.
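The abstract describes two ingredients: embedding each entity in both Euclidean and hyperbolic space while maximizing cross-space consistency, and a contrastive objective that separates similar neighbors. Below is a minimal sketch of how such a cross-space contrastive consistency loss could look; it is not the authors' implementation. The Poincaré-ball exponential map, the cosine-similarity logits (a simplification of a true hyperbolic distance), and names such as `expmap0`, `cross_space_infonce`, the curvature `c`, and the temperature are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def expmap0(x: torch.Tensor, c: float = 1.0) -> torch.Tensor:
    """Exponential map at the origin of the Poincare ball with curvature -c:
    maps a Euclidean (tangent) vector into hyperbolic space."""
    sqrt_c = c ** 0.5
    norm = x.norm(dim=-1, keepdim=True).clamp_min(1e-8)
    return torch.tanh(sqrt_c * norm) * x / (sqrt_c * norm)

def cross_space_infonce(z_euc: torch.Tensor,
                        z_hyp: torch.Tensor,
                        temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: entity i's Euclidean and hyperbolic embeddings
    form a positive pair, while every other entity in the batch acts as a
    negative, pushing apart embeddings of distinct (e.g., similar
    neighboring) entities."""
    z_euc = F.normalize(z_euc, dim=-1)
    z_hyp = F.normalize(z_hyp, dim=-1)
    logits = z_euc @ z_hyp.t() / temperature            # (B, B) similarities
    targets = torch.arange(z_euc.size(0), device=z_euc.device)  # diagonal
    return F.cross_entropy(logits, targets)

# Toy usage: take Euclidean embeddings (e.g., from a GNN encoder), map them
# into hyperbolic space, then apply the cross-space contrastive loss.
euc = torch.randn(32, 128)           # stand-in for GNN entity embeddings
hyp = expmap0(euc)                   # their hyperbolic counterparts
loss = cross_space_infonce(euc, hyp)
```

This sketch collapses the paper's two objectives into one loss for brevity; in practice the consistency term and the neighbor-separating contrastive term could be weighted separately.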
Anthology ID:
2025.coling-main.209
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
3110–3122
URL:
https://aclanthology.org/2025.coling-main.209/
Cite (ACL):
Cunda Wang, Weihua Wang, Qiuyu Liang, Feilong Bao, and Guanglai Gao. 2025. Unifying Dual-Space Embedding for Entity Alignment via Contrastive Learning. In Proceedings of the 31st International Conference on Computational Linguistics, pages 3110–3122, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Unifying Dual-Space Embedding for Entity Alignment via Contrastive Learning (Wang et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.209.pdf