Contrastive Representation Learning for Cross-Document Coreference Resolution of Events and Entities

Benjamin Hsu, Graham Horwood


Abstract
Identifying related entities and events within and across documents is fundamental to natural language understanding. We present an approach to entity and event coreference resolution utilizing contrastive representation learning. Earlier state-of-the-art methods formulated this task as binary classification and leveraged large transformers in a cross-encoder architecture to achieve their results. For large collections of documents with a corresponding set of n mentions, the necessity of performing n² transformer computations in these earlier approaches can be computationally intensive. We show that it is possible to reduce this burden by applying contrastive learning techniques that require only n transformer computations at inference time. Our method achieves state-of-the-art results on a number of key metrics on the ECB+ corpus and is competitive on others.
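The n-versus-n² distinction can be illustrated with a minimal sketch (not the authors' code): in a bi-encoder setup, each mention is passed through the transformer once, and pairwise coreference scores are then obtained with cheap vector operations over the cached embeddings. The `encode` function below is a toy stand-in for a transformer forward pass so the example is self-contained; any names here are illustrative assumptions.

```python
import numpy as np

def encode(mention: str, dim: int = 8) -> np.ndarray:
    """Toy stand-in for one transformer forward pass.

    Produces a deterministic (per run) unit-norm embedding; a real system
    would use a contrastively trained mention encoder here.
    """
    rng = np.random.default_rng(abs(hash(mention)) % (2**32))
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

def pairwise_scores_bi_encoder(mentions):
    # n encoder calls total: embed each mention exactly once...
    embs = np.stack([encode(m) for m in mentions])
    # ...then score all n^2 pairs with dot products (cosine similarity,
    # since embeddings are unit-norm) instead of n^2 transformer calls.
    return embs @ embs.T

mentions = ["quake hits city", "earthquake strikes town", "election results"]
scores = pairwise_scores_bi_encoder(mentions)
# scores[i, j] is the similarity that a clustering step would use to
# group mentions into coreference chains.
```

A cross-encoder, by contrast, would have to run the transformer on every concatenated pair, which is what makes it quadratic in the number of mentions.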
Anthology ID:
2022.naacl-main.267
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
3644–3655
URL:
https://aclanthology.org/2022.naacl-main.267
DOI:
10.18653/v1/2022.naacl-main.267
Bibkey:
Cite (ACL):
Benjamin Hsu and Graham Horwood. 2022. Contrastive Representation Learning for Cross-Document Coreference Resolution of Events and Entities. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 3644–3655, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Contrastive Representation Learning for Cross-Document Coreference Resolution of Events and Entities (Hsu & Horwood, NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.267.pdf
Video:
https://aclanthology.org/2022.naacl-main.267.mp4
Data
ECB+