Building Low-Resource NER Models Using Non-Speaker Annotations

Tatiana Tsygankova, Francesca Marini, Stephen Mayhew, Dan Roth


Abstract
In low-resource natural language processing (NLP), the key problems are a lack of target-language training data and a lack of native speakers to create it. Cross-lingual methods have had notable success in addressing these concerns, but in certain common circumstances, such as insufficient pre-training corpora or languages distant from the source language, their performance suffers. In this work we propose a complementary approach to building low-resource Named Entity Recognition (NER) models using “non-speaker” (NS) annotations, provided by annotators with no prior experience in the target language. We recruit 30 participants in a carefully controlled annotation experiment with Indonesian, Russian, and Hindi. We show that the use of NS annotators consistently produces results on par with or better than cross-lingual methods built on modern contextual representations, with the potential to outperform them given additional effort. We conclude with observations of common annotation patterns and recommended implementation practices, and motivate how NS annotations can be used alongside prior methods for improved performance.
Anthology ID: 2021.dash-1.11
Volume: Proceedings of the Second Workshop on Data Science with Human in the Loop: Language Advances
Month: June
Year: 2021
Address: Online
Venues: DaSH | NAACL
Publisher: Association for Computational Linguistics
Pages: 62–69
URL: https://aclanthology.org/2021.dash-1.11
DOI: 10.18653/v1/2021.dash-1.11
Cite (ACL): Tatiana Tsygankova, Francesca Marini, Stephen Mayhew, and Dan Roth. 2021. Building Low-Resource NER Models Using Non-Speaker Annotations. In Proceedings of the Second Workshop on Data Science with Human in the Loop: Language Advances, pages 62–69, Online. Association for Computational Linguistics.
Cite (Informal): Building Low-Resource NER Models Using Non-Speaker Annotations (Tsygankova et al., DaSH 2021)
PDF: https://aclanthology.org/2021.dash-1.11.pdf