Event Coreference Resolution with Non-Local Information

Jing Lu, Vincent Ng


Abstract
We present two extensions to a state-of-the-art joint model for event coreference resolution, which involve incorporating (1) a supervised topic model that improves trigger detection by providing global context, and (2) a preprocessing module that seeks to improve event coreference by discarding unlikely candidate antecedents of an event mention using discourse contexts computed from salient entities. The resulting model yields the best results reported to date on the KBP 2017 English and Chinese datasets.
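The second extension, pruning unlikely candidate antecedents using discourse context built from salient entities, can be illustrated at a high level with a minimal sketch. The Python below is a hypothetical illustration of the general idea only, not the authors' implementation; the EventMention class, the salient_entities field, the prune_antecedents function, and the sentence-distance window are all assumptions made for this example.

from dataclasses import dataclass, field

@dataclass
class EventMention:
    mention_id: str
    trigger: str
    sentence_idx: int
    # Salient entities observed in the mention's discourse context (assumed input).
    salient_entities: set = field(default_factory=set)

def prune_antecedents(anaphor, candidates, window=5):
    """Keep candidates that are nearby or share a salient entity with the anaphor."""
    kept = []
    for cand in candidates:
        close = abs(anaphor.sentence_idx - cand.sentence_idx) <= window
        shared = bool(anaphor.salient_entities & cand.salient_entities)
        if close or shared:
            kept.append(cand)
    return kept

# Example: the distant "election" mention shares no salient entity with the
# "bombing" anaphor and is discarded; the "attack" mention is kept.
m1 = EventMention("E1", "attack", 0, {"Baghdad"})
m2 = EventMention("E2", "bombing", 12, {"Baghdad"})
m3 = EventMention("E3", "election", 20, {"parliament"})
print([c.mention_id for c in prune_antecedents(m2, [m1, m3])])  # ['E1']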
Anthology ID:
2020.aacl-main.66
Volume:
Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
Month:
December
Year:
2020
Address:
Suzhou, China
Editors:
Kam-Fai Wong, Kevin Knight, Hua Wu
Venue:
AACL
Publisher:
Association for Computational Linguistics
Pages:
653–663
URL:
https://aclanthology.org/2020.aacl-main.66
Cite (ACL):
Jing Lu and Vincent Ng. 2020. Event Coreference Resolution with Non-Local Information. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 653–663, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Event Coreference Resolution with Non-Local Information (Lu & Ng, AACL 2020)
PDF:
https://aclanthology.org/2020.aacl-main.66.pdf