EventKE: Event-Enhanced Knowledge Graph Embedding

Zixuan Zhang, Hongwei Wang, Han Zhao, Hanghang Tong, Heng Ji


Abstract
Relations in most traditional knowledge graphs (KGs) reflect only static, factual connections and fail to represent the dynamic activities and state changes of entities. In this paper, we emphasize the importance of incorporating events into KG representation learning and propose EventKE, an event-enhanced KG embedding model. Specifically, given the original KG, we first incorporate event nodes by building a heterogeneous network, in which entity nodes and event nodes lie on the two sides of the network, inter-connected by event-argument links. We then use entity-entity relations from the original KG and event-event temporal links to inner-connect the entity and event nodes, respectively. We design a novel and effective attention-based message passing method, conducted over entity-entity, event-entity, and event-event relations, to fuse event information into the KG embeddings. Experimental results on real-world datasets demonstrate that events can greatly improve the quality of KG embeddings on multiple downstream tasks.
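To illustrate the core idea described in the abstract, the following is a minimal sketch of one round of attention-based message passing from event nodes to entity nodes over event-argument links. This is a hypothetical simplification for illustration only (plain dot-product attention, a single aggregation step); the function name, link format, and update rule are assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of attention logits.
    e = np.exp(x - x.max())
    return e / e.sum()

def fuse_event_messages(entity_emb, event_emb, arg_links):
    """One illustrative round of attention-based message passing.

    entity_emb: (n_entities, d) entity embeddings
    event_emb:  (n_events, d) event embeddings
    arg_links:  dict mapping entity index -> list of event indices in which
                the entity appears as an argument (hypothetical format)
    """
    new_emb = entity_emb.copy()
    for ent, events in arg_links.items():
        if not events:
            continue  # entities with no event arguments keep their embedding
        msgs = event_emb[events]            # (k, d) neighboring event embeddings
        scores = msgs @ entity_emb[ent]     # dot-product attention logits
        weights = softmax(scores)           # normalize over event neighbors
        # Fuse the attention-weighted event information into the entity embedding.
        new_emb[ent] = entity_emb[ent] + weights @ msgs
    return new_emb

# Toy example: 3 entities, 2 events; entity 0 is an argument of both events.
rng = np.random.default_rng(0)
entities = rng.normal(size=(3, 4))
events = rng.normal(size=(2, 4))
links = {0: [0, 1], 1: [1], 2: []}
fused = fuse_event_messages(entities, events, links)
```

In the full model, analogous passing steps would also run over entity-entity relations from the original KG and event-event temporal links, so that entity and event representations inform each other.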
Anthology ID:
2021.findings-emnlp.120
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1389–1400
URL:
https://aclanthology.org/2021.findings-emnlp.120
DOI:
10.18653/v1/2021.findings-emnlp.120
Cite (ACL):
Zixuan Zhang, Hongwei Wang, Han Zhao, Hanghang Tong, and Heng Ji. 2021. EventKE: Event-Enhanced Knowledge Graph Embedding. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 1389–1400, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
EventKE: Event-Enhanced Knowledge Graph Embedding (Zhang et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.120.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.120.mp4