What happens before and after: Multi-Event Commonsense in Event Coreference Resolution

Sahithya Ravi, Chris Tanner, Raymond Ng, Vered Shwartz


Abstract
Event coreference models cluster event mentions pertaining to the same real-world event. Recent models rely on contextualized representations to recognize coreference among lexically or contextually similar mentions. However, they typically fail to leverage commonsense inferences, which is particularly limiting for resolving lexically-divergent mentions. We propose a model that extends event mentions with temporal commonsense inferences. Given a complex sentence with multiple events, e.g., “the man killed his wife and got arrested”, with the target event “arrested”, our model generates plausible events that happen before the target event, such as “the police arrived”, and after it, such as “he was sentenced”. We show that incorporating such inferences into an existing event coreference model improves its performance, and we analyze the coreference links for which such temporal knowledge is required.
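The abstract gives no implementation details, but the pipeline it describes can be illustrated with a short sketch: a seq2seq generator produces “before” and “after” inferences for a target event mention, and those inferences are appended to the mention’s context before it reaches a coreference scorer. Everything below is an assumption for illustration, not the authors’ actual setup: the checkpoint name is a stand-in (a COMET-style generator trained on temporal relations would be used in practice), and the prompt format and temporal_inferences helper are hypothetical.

```python
# Minimal, hypothetical sketch of the idea in the abstract.
# Placeholder checkpoint and prompt format; NOT the paper's actual method.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "facebook/bart-large"  # stand-in; substitute a COMET-style
                                    # model trained for before/after relations

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def temporal_inferences(sentence: str, trigger: str, relation: str, k: int = 3):
    """Generate k plausible events that happen `relation` ('before'/'after')
    the target event marked by `trigger` in `sentence`."""
    # Hypothetical prompt: mark the trigger and ask for the temporal relation.
    prompt = f"{sentence} What happens {relation} <{trigger}>?"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        num_beams=k,
        num_return_sequences=k,
        max_new_tokens=20,
    )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

sentence = "The man killed his wife and got arrested."
before = temporal_inferences(sentence, "arrested", "before")
after = temporal_inferences(sentence, "arrested", "after")

# One simple way to use the inferences (again an assumption, not the paper's
# exact architecture): concatenate them with the mention's context so a
# downstream pairwise coreference scorer sees the expanded representation.
expanded = sentence + " Before: " + "; ".join(before) + " After: " + "; ".join(after)
print(expanded)
```

A design note on the sketch: expanding the mention text is only one plausible way to inject the inferences; fusing them at the representation level (e.g., as extra encoder inputs to the pairwise scorer) would serve the same goal of bridging lexically-divergent mentions.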
Anthology ID:
2023.eacl-main.125
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1708–1724
URL:
https://aclanthology.org/2023.eacl-main.125
DOI:
10.18653/v1/2023.eacl-main.125
Cite (ACL):
Sahithya Ravi, Chris Tanner, Raymond Ng, and Vered Shwartz. 2023. What happens before and after: Multi-Event Commonsense in Event Coreference Resolution. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1708–1724, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
What happens before and after: Multi-Event Commonsense in Event Coreference Resolution (Ravi et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.125.pdf
Video:
https://aclanthology.org/2023.eacl-main.125.mp4