Yuval Varkel


2020

Pre-training Mention Representations in Coreference Models
Yuval Varkel | Amir Globerson
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Collecting labeled data for coreference resolution is a challenging task, requiring skilled annotators. It is thus desirable to develop coreference resolution models that can make use of unlabeled data. Here we provide such an approach for the powerful class of neural coreference models. These models rely on representations of mentions, and we show that these representations can be learned in a self-supervised manner to improve resolution accuracy. We propose two self-supervised tasks that are closely related to coreference resolution and thus improve mention representations. Applying this approach to the GAP dataset yields new state-of-the-art results.