PreCo: A Large-scale Dataset in Preschool Vocabulary for Coreference Resolution

Hong Chen, Zhenhua Fan, Hao Lu, Alan Yuille, Shu Rong


Abstract
We introduce PreCo, a large-scale English dataset for coreference resolution. The dataset is designed to embody the core challenges in coreference, such as entity representation, by alleviating the problem of low overlap between training and test sets and enabling separate analysis of mention detection and mention clustering. To strengthen the training-test overlap, we collect a large corpus of 38K documents and 12.5M words, drawn mostly from the vocabulary of English-speaking preschoolers. Experiments show that with higher training-test overlap, error analysis on PreCo is more efficient than on OntoNotes, a popular existing dataset. Furthermore, we annotate singleton mentions, making it possible for the first time to quantify the influence that a mention detector has on coreference resolution performance. The dataset is freely available at https://preschool-lab.github.io/PreCo/.
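
The abstract's point about singleton annotation is that mention detection can be evaluated independently of mention clustering. As a minimal illustration of such a separated evaluation (this is not the paper's evaluation code, and the (sentence_index, start, end) span encoding is an assumption made for this sketch), mention detection can be scored by exact span match between predicted and gold mentions:

    # Minimal sketch: exact-match mention detection scoring.
    # Not the paper's evaluation code; the (sentence_idx, start, end)
    # span encoding is an assumption for this illustration.

    def mention_detection_prf(gold_mentions, predicted_mentions):
        """Precision/recall/F1 over mention spans by exact match."""
        gold = set(gold_mentions)
        pred = set(predicted_mentions)
        true_positives = len(gold & pred)
        precision = true_positives / len(pred) if pred else 0.0
        recall = true_positives / len(gold) if gold else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        return precision, recall, f1

    # Example: spans are (sentence_idx, token_start, token_end) tuples.
    gold = [(0, 0, 2), (0, 5, 6), (1, 3, 7)]
    pred = [(0, 0, 2), (1, 3, 7), (1, 0, 1)]
    print(mention_detection_prf(gold, pred))  # (0.667, 0.667, 0.667)

Standard coreference metrics such as MUC, B-cubed, and CEAF then evaluate the clustering step on top of the detected mentions.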
Anthology ID:
D18-1016
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October–November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
172–181
URL:
https://aclanthology.org/D18-1016
DOI:
10.18653/v1/D18-1016
Cite (ACL):
Hong Chen, Zhenhua Fan, Hao Lu, Alan Yuille, and Shu Rong. 2018. PreCo: A Large-scale Dataset in Preschool Vocabulary for Coreference Resolution. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 172–181, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
PreCo: A Large-scale Dataset in Preschool Vocabulary for Coreference Resolution (Chen et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1016.pdf
Video:
https://aclanthology.org/D18-1016.mp4
Data
PreCo, RACE, SQuAD