Improving Span Representation for Domain-adapted Coreference Resolution

Nupoor Gandhi, Anjalie Field, Yulia Tsvetkov


Abstract
Recent work has shown that fine-tuning neural coreference models can produce strong performance when adapting to different domains. However, this can require a large number of annotated target-domain examples. In this work, we focus on supervised domain adaptation for clinical notes, proposing the use of concept knowledge to more efficiently adapt coreference models to a new domain. We develop methods to improve the span representations via (1) a retrofitting loss that incentivizes span representations to satisfy a knowledge-based distance function and (2) a scaffolding loss that guides the recovery of knowledge from the span representation. By integrating these losses, our model improves on the baseline's precision and F1 score. In particular, we show that incorporating knowledge with end-to-end coreference models results in better performance on the most challenging, domain-specific spans.
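The two auxiliary losses described above can be sketched roughly as follows. This is an illustrative assumption-based sketch, not the paper's actual formulation: the function names, the squared-gap form of the retrofitting objective, the linear-probe form of the scaffolding objective, and the loss weights are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def retrofitting_loss(span_reprs, knowledge_dist):
    """Mean squared gap between embedding-space distances and a
    knowledge-based distance function over span pairs (hypothetical form)."""
    n, loss, pairs = len(span_reprs), 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            emb_dist = np.linalg.norm(span_reprs[i] - span_reprs[j])
            loss += (emb_dist - knowledge_dist[i, j]) ** 2
            pairs += 1
    return loss / pairs

def scaffolding_loss(span_reprs, concept_labels, probe_w):
    """Cross-entropy of a linear probe that tries to recover concept
    labels from the span representations (hypothetical form)."""
    logits = span_reprs @ probe_w
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return -np.mean(np.log(probs[np.arange(len(concept_labels)), concept_labels]))

# Toy data: 4 spans, 8-dim representations, 3 concept classes.
spans = rng.standard_normal((4, 8))
kdist = np.abs(rng.standard_normal((4, 4)))   # stand-in knowledge distances
labels = np.array([0, 2, 1, 0])               # stand-in concept labels
probe = rng.standard_normal((8, 3))

# Auxiliary losses would be added to the base coreference loss;
# the 1.0 / 0.5 weights here are arbitrary.
total = 1.0 * retrofitting_loss(spans, kdist) + 0.5 * scaffolding_loss(spans, labels, probe)
```

In this sketch the two terms act on the same span representations from complementary directions: retrofitting pulls the embedding geometry toward knowledge-based distances, while scaffolding requires that concept information remain linearly decodable from the spans.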
Anthology ID:
2021.crac-1.13
Volume:
Proceedings of the Fourth Workshop on Computational Models of Reference, Anaphora and Coreference
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Venues:
CRAC | EMNLP
Publisher:
Association for Computational Linguistics
Pages:
121–131
URL:
https://aclanthology.org/2021.crac-1.13
DOI:
10.18653/v1/2021.crac-1.13
Cite (ACL):
Nupoor Gandhi, Anjalie Field, and Yulia Tsvetkov. 2021. Improving Span Representation for Domain-adapted Coreference Resolution. In Proceedings of the Fourth Workshop on Computational Models of Reference, Anaphora and Coreference, pages 121–131, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Improving Span Representation for Domain-adapted Coreference Resolution (Gandhi et al., CRAC 2021)
PDF:
https://aclanthology.org/2021.crac-1.13.pdf
Code:
nupoorgandhi/i2b2-coref-public