Annotating Mentions Alone Enables Efficient Domain Adaptation for Coreference Resolution

Nupoor Gandhi, Anjalie Field, Emma Strubell


Abstract
Although recent neural models for coreference resolution have led to substantial improvements on benchmark datasets, it remains a challenge to successfully transfer these models to new target domains containing many out-of-vocabulary spans and requiring different annotation schemes. Typical approaches involve continued training on annotated target-domain data, but obtaining annotations is costly and time-consuming. In this work, we show that adapting mention detection, rather than antecedent linking, is the key component for successful domain adaptation of coreference models. We also show that annotating mentions alone is nearly twice as fast as annotating full coreference chains. Based on these insights, we propose a method for efficiently adapting coreference models, which includes a high-precision mention detection objective and requires only mention annotations in the target domain. Extensive evaluation across three English coreference datasets, CoNLL-2012 (news/conversation), i2b2/VA (medical notes), and child welfare notes, reveals that our approach facilitates annotation-efficient transfer and results in a 7-14% improvement in average F1 without increasing annotator time.
Anthology ID:
2023.acl-long.588
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
10543–10558
URL:
https://aclanthology.org/2023.acl-long.588
DOI:
10.18653/v1/2023.acl-long.588
Bibkey:
Cite (ACL):
Nupoor Gandhi, Anjalie Field, and Emma Strubell. 2023. Annotating Mentions Alone Enables Efficient Domain Adaptation for Coreference Resolution. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 10543–10558, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Annotating Mentions Alone Enables Efficient Domain Adaptation for Coreference Resolution (Gandhi et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.588.pdf
Video:
https://aclanthology.org/2023.acl-long.588.mp4