Active Learning for Coreference Resolution using Discrete Annotation

Belinda Z. Li, Gabriel Stanovsky, Luke Zettlemoyer


Abstract
We improve upon pairwise annotation for active learning in coreference resolution by asking annotators to identify a mention's antecedent whenever a presented mention pair is judged not coreferent. This simple modification, combined with a novel mention clustering algorithm for selecting which examples to label, yields substantially better performance for a given annotation budget. In experiments with existing benchmark coreference datasets, we show that the signal from this additional question leads to significant performance gains per human-annotation hour. Future work can use our annotation protocol to effectively develop coreference models for new domains. Our code is publicly available.
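
The annotator-facing part of the protocol described above can be sketched as a simple query loop. The sketch below is a hypothetical illustration rather than the authors' released implementation (see the linked repository at the bottom of this page); the Mention class and the ask_yes_no / ask_antecedent helpers are placeholder names introduced here, and the active-learning example selection and mention clustering algorithm are not shown.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Mention:
    start: int  # token offset where the mention span begins
    end: int    # token offset where the mention span ends
    text: str   # surface form of the mention


def ask_yes_no(prompt: str) -> bool:
    # Standard pairwise question: the annotator answers yes or no.
    return input(prompt + " [y/n] ").strip().lower().startswith("y")


def ask_antecedent(prompt: str, options: List[Mention]) -> Optional[Mention]:
    # Discrete question: the annotator picks the antecedent directly,
    # or indicates that the mention starts a new cluster.
    print(prompt)
    for i, m in enumerate(options):
        print(f"  [{i}] {m.text}")
    print("  [n] none (this mention starts a new cluster)")
    choice = input("choice: ").strip().lower()
    return None if choice == "n" else options[int(choice)]


def discrete_annotation_query(
    mention: Mention,
    model_antecedent: Optional[Mention],
    prior_mentions: List[Mention],
) -> Optional[Mention]:
    # First, the usual pairwise check of the model's proposed link.
    if model_antecedent is not None and ask_yes_no(
        f"Does '{mention.text}' refer to '{model_antecedent.text}'?"
    ):
        return model_antecedent
    # If the pair is rejected, ask the additional discrete question instead
    # of a further sequence of pairwise yes/no queries.
    return ask_antecedent(
        f"Which earlier mention, if any, does '{mention.text}' refer to?",
        options=prior_mentions,
    )

The point of the second question is that a rejected pairwise link is converted into a single discrete choice over earlier mentions, so each query yields a full antecedent label rather than one bit of information.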
Anthology ID: 2020.acl-main.738
Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2020
Address: Online
Editors: Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 8320–8331
URL: https://aclanthology.org/2020.acl-main.738
DOI: 10.18653/v1/2020.acl-main.738
Cite (ACL): Belinda Z. Li, Gabriel Stanovsky, and Luke Zettlemoyer. 2020. Active Learning for Coreference Resolution using Discrete Annotation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 8320–8331, Online. Association for Computational Linguistics.
Cite (Informal): Active Learning for Coreference Resolution using Discrete Annotation (Li et al., ACL 2020)
PDF: https://aclanthology.org/2020.acl-main.738.pdf
Video: http://slideslive.com/38928722
Code: belindal/discrete-active-learning-coref