Improving Event Coreference Resolution Using Document-level and Topic-level Information

Sheng Xu, Peifeng Li, Qiaoming Zhu
Abstract
Event coreference resolution (ECR) aims to cluster event mentions that refer to the same real-world events. Deep learning methods have achieved SOTA results on the ECR task. However, due to encoding length limitations, previous methods either adopt classical pairwise models based on sentence-level context or split each document into multiple chunks and encode them separately, and thus fail to capture the interactions and contextual cues among long-distance event mentions. Besides, high-level information, such as event topics, is rarely considered to enhance representation learning for ECR. To address these two issues, we first apply a Longformer-based encoder to obtain document-level embeddings and an encoder with a trigger-mask mechanism to learn sentence-level embeddings from local context. In addition, we propose an event topic generator to infer latent topic-level representations. Finally, using the above event embeddings, we employ a multiple tensor matching method to capture their interactions at the document, sentence, and topic levels. Experimental results on the KBP 2017 dataset show that our model outperforms the SOTA baselines.
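
To make the multi-level matching step concrete, below is a minimal PyTorch sketch of pairwise scoring over document-, sentence-, and topic-level event embeddings. The bilinear matching form, dimensions, and names (MultiLevelMatcher, doc/sent/topic) are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class MultiLevelMatcher(nn.Module):
    """Pairwise coreference scorer over document-, sentence-, and
    topic-level event embeddings (illustrative sketch only)."""

    def __init__(self, dim: int = 768, hidden: int = 256):
        super().__init__()
        # One bilinear "tensor matching" layer per representation level
        # (assumed form; the paper's exact matching operator may differ).
        self.doc_match = nn.Bilinear(dim, dim, hidden)
        self.sent_match = nn.Bilinear(dim, dim, hidden)
        self.topic_match = nn.Bilinear(dim, dim, hidden)
        # Binary classifier over the concatenated matching features.
        self.classifier = nn.Sequential(
            nn.Linear(3 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, e1: dict, e2: dict) -> torch.Tensor:
        """e1/e2: dicts with 'doc', 'sent', 'topic' tensors of shape (batch, dim)."""
        feats = torch.cat(
            [
                self.doc_match(e1["doc"], e2["doc"]),
                self.sent_match(e1["sent"], e2["sent"]),
                self.topic_match(e1["topic"], e2["topic"]),
            ],
            dim=-1,
        )
        # Probability that the two event mentions corefer.
        return torch.sigmoid(self.classifier(feats))
```

In this sketch the document-level inputs would come from a Longformer-style encoder, the sentence-level inputs from a local trigger-masked encoder, and the topic-level inputs from a latent topic generator; pairwise scores can then be clustered into coreference chains.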
Anthology ID:
2022.emnlp-main.454
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6765–6775
URL:
https://aclanthology.org/2022.emnlp-main.454
DOI:
10.18653/v1/2022.emnlp-main.454
Cite (ACL):
Sheng Xu, Peifeng Li, and Qiaoming Zhu. 2022. Improving Event Coreference Resolution Using Document-level and Topic-level Information. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 6765–6775, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Improving Event Coreference Resolution Using Document-level and Topic-level Information (Xu et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.454.pdf