Event-Event Relation Extraction using Probabilistic Box Embedding

EunJeong Hwang, Jay-Yoon Lee, Tianyi Yang, Dhruvesh Patel, Dongxu Zhang, Andrew McCallum


Abstract
To understand a story with multiple events, it is important to capture the proper relations across these events. However, existing event relation extraction (ERE) frameworks regard it as a multi-class classification task and do not guarantee any coherence between different relation types, such as anti-symmetry. If a phone line "died" after a "storm", then it is obvious that the "storm" happened before the "died". Current ERE frameworks do not guarantee this coherence and thus enforce it via a constraint loss function (Wang et al., 2020). In this work, we propose to modify the underlying ERE model to guarantee coherence by representing each event as a box representation (BERE) without applying explicit constraints. In our experiments, BERE also shows stronger conjunctive constraint satisfaction while performing on par with or better in F1 than previous models with constraint injection.
Anthology ID:
2022.acl-short.26
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
235–244
URL:
https://aclanthology.org/2022.acl-short.26
DOI:
10.18653/v1/2022.acl-short.26
Cite (ACL):
EunJeong Hwang, Jay-Yoon Lee, Tianyi Yang, Dhruvesh Patel, Dongxu Zhang, and Andrew McCallum. 2022. Event-Event Relation Extraction using Probabilistic Box Embedding. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 235–244, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Event-Event Relation Extraction using Probabilistic Box Embedding (Hwang et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-short.26.pdf
Code:
iesl/ce2ere