Dual Attention Network for Cross-lingual Entity Alignment

Jian Sun, Yu Zhou, Chengqing Zong


Abstract
Cross-lingual entity alignment is an essential step in building a knowledge graph (KG), as it helps integrate knowledge across KGs in different languages. In real-world KGs, the information available at the same hierarchy of corresponding entities is often imbalanced, which leads to heterogeneous neighborhood structures and makes this task challenging. To tackle this problem, we propose a dual attention network for cross-lingual entity alignment (DAEA). Specifically, our dual attention consists of relation-aware graph attention and hierarchical attention. The relation-aware graph attention selectively aggregates multi-hierarchy neighborhood information to alleviate the structural heterogeneity between counterpart entities. The hierarchical attention adaptively aggregates low-hierarchy and high-hierarchy information, which helps balance the neighborhood information of counterpart entities and distinguish non-counterpart entities with similar structures. Finally, we treat cross-lingual entity alignment as a link prediction process. Experimental results on three real-world cross-lingual entity alignment datasets demonstrate the effectiveness of DAEA.
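The sketch below is a minimal, illustrative PyTorch rendering of the two components named in the abstract (relation-aware graph attention over (head, relation, tail) edges, and hierarchical attention mixing 1-hop and 2-hop views) together with alignment scored as link prediction by embedding distance. All class names, dimensions, and the toy graph are assumptions for illustration, not the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationAwareGAT(nn.Module):
    """One layer of relation-aware graph attention: each neighbor's weight
    depends on the head entity, the relation on the edge, and the tail entity."""

    def __init__(self, dim):
        super().__init__()
        self.attn = nn.Linear(3 * dim, 1)
        self.proj = nn.Linear(dim, dim)

    def forward(self, ent_emb, rel_emb, edges):
        # edges: LongTensor [num_edges, 3] of (head, relation, tail) indices
        h, r, t = edges[:, 0], edges[:, 1], edges[:, 2]
        # Unnormalized attention logit per edge from (head, relation, tail) features
        logits = self.attn(torch.cat([ent_emb[h], rel_emb[r], ent_emb[t]], dim=-1)).squeeze(-1)
        # Segment softmax: normalize logits over each head entity's neighborhood
        exp_logits = torch.exp(logits - logits.max())
        denom = torch.zeros(ent_emb.size(0), device=logits.device).index_add_(0, h, exp_logits)
        weights = exp_logits / (denom[h] + 1e-12)
        # Aggregate weighted neighbor (tail) messages back to each head entity
        messages = weights.unsqueeze(-1) * self.proj(ent_emb[t])
        out = torch.zeros_like(ent_emb).index_add_(0, h, messages)
        return F.relu(out + ent_emb)  # residual keeps the entity's own features


class HierarchicalAttention(nn.Module):
    """Adaptively mixes low-hierarchy (1-hop) and high-hierarchy (2-hop)
    representations of each entity instead of simply concatenating them."""

    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, low, high):
        stacked = torch.stack([low, high], dim=1)       # [num_ents, 2, dim]
        alpha = F.softmax(self.score(stacked), dim=1)   # per-entity mixing weights
        return (alpha * stacked).sum(dim=1)


class DualAttentionEncoder(nn.Module):
    def __init__(self, num_ents, num_rels, dim=64):
        super().__init__()
        self.ent_emb = nn.Embedding(num_ents, dim)
        self.rel_emb = nn.Embedding(num_rels, dim)
        self.gat1 = RelationAwareGAT(dim)
        self.gat2 = RelationAwareGAT(dim)
        self.hier = HierarchicalAttention(dim)

    def forward(self, edges):
        low = self.gat1(self.ent_emb.weight, self.rel_emb.weight, edges)   # 1-hop view
        high = self.gat2(low, self.rel_emb.weight, edges)                  # 2-hop view
        return self.hier(low, high)


# Alignment as link prediction: score candidate entity pairs by embedding distance.
if __name__ == "__main__":
    edges = torch.tensor([[0, 0, 1], [1, 1, 2], [2, 0, 3]])  # toy (h, r, t) triples
    enc = DualAttentionEncoder(num_ents=4, num_rels=2)
    emb = enc(edges)
    # Smaller L1 distance suggests a more likely counterpart pair across KGs
    print(torch.cdist(emb, emb, p=1))

In practice the two KGs would be encoded jointly and trained with a margin-based loss over seed alignment pairs, but that training loop is omitted here for brevity.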
Anthology ID:
2020.coling-main.284
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3190–3201
URL:
https://aclanthology.org/2020.coling-main.284
DOI:
10.18653/v1/2020.coling-main.284
Cite (ACL):
Jian Sun, Yu Zhou, and Chengqing Zong. 2020. Dual Attention Network for Cross-lingual Entity Alignment. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3190–3201, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Dual Attention Network for Cross-lingual Entity Alignment (Sun et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.284.pdf
Data
DBP15K