A Simple Unsupervised Approach for Coreference Resolution using Rule-based Weak Supervision

Alessandro Stolfo, Chris Tanner, Vikram Gupta, Mrinmaya Sachan


Abstract
Labeled data for the task of coreference resolution is a scarce resource, requiring significant human effort. While state-of-the-art coreference models rely on such data, we propose an approach that leverages an end-to-end neural model in settings where labeled data is unavailable. Specifically, using weak supervision, we transfer the linguistic knowledge encoded by Stanford's rule-based coreference system to the end-to-end model, which jointly learns rich, contextualized span representations and coreference chains. Our experiments on the English OntoNotes corpus demonstrate that our approach effectively benefits from the noisy coreference supervision, yielding an improvement over Stanford's rule-based system (+3.7 F1) and outperforming the previous best unsupervised model (+0.9 F1). Additionally, we validate the efficacy of our method on two other datasets: PreCo and LitBank (+2.5 and +5 F1 over Stanford's system, respectively).
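The weak-supervision step the abstract describes (a deterministic system producing noisy "silver" coreference chains that the neural model is then trained on) can be approximated in a few lines. Below is a minimal sketch, assuming a local Stanford CoreNLP installation accessed through stanza's CoreNLPClient and its rule-based dcoref annotator; the (sentence, start, end) span tuples are an illustrative convention, not the paper's actual training format.

# Sketch of the weak-supervision pipeline: run Stanford's rule-based
# (deterministic) coreference sieves over raw text and keep the resulting
# chains as noisy silver clusters for training an end-to-end neural model.
# Assumes a CoreNLP server reachable via stanza's client.
from stanza.server import CoreNLPClient

def silver_clusters(text):
    """Rule-based coreference chains as lists of token spans."""
    with CoreNLPClient(
        annotators=["tokenize", "ssplit", "pos", "lemma", "ner", "parse", "dcoref"],
        be_quiet=True,
    ) as client:
        ann = client.annotate(text)
    clusters = []
    for chain in ann.corefChain:
        # Each mention is a within-sentence token span (end index exclusive).
        spans = [(m.sentenceIndex, m.beginIndex, m.endIndex) for m in chain.mention]
        if len(spans) > 1:  # singleton chains carry no pairwise supervision
            clusters.append(spans)
    return clusters

# Example: the two mentions of Obama fall into one silver cluster, which
# would then be serialized (e.g., OntoNotes-style jsonlines) for training.
print(silver_clusters("Barack Obama visited Paris. He met the mayor there."))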
Anthology ID: 2022.starsem-1.7
Volume: Proceedings of the 11th Joint Conference on Lexical and Computational Semantics
Month: July
Year: 2022
Address: Seattle, Washington
Editors: Vivi Nastase, Ellie Pavlick, Mohammad Taher Pilehvar, Jose Camacho-Collados, Alessandro Raganato
Venue: *SEM
SIG: SIGLEX
Publisher: Association for Computational Linguistics
Pages: 79–88
URL: https://aclanthology.org/2022.starsem-1.7
DOI: 10.18653/v1/2022.starsem-1.7
Cite (ACL): Alessandro Stolfo, Chris Tanner, Vikram Gupta, and Mrinmaya Sachan. 2022. A Simple Unsupervised Approach for Coreference Resolution using Rule-based Weak Supervision. In Proceedings of the 11th Joint Conference on Lexical and Computational Semantics, pages 79–88, Seattle, Washington. Association for Computational Linguistics.
Cite (Informal): A Simple Unsupervised Approach for Coreference Resolution using Rule-based Weak Supervision (Stolfo et al., *SEM 2022)
PDF: https://aclanthology.org/2022.starsem-1.7.pdf
Data: PreCo