Crowd-sourcing annotation of complex NLU tasks: A case study of argumentative content annotation

Tamar Lavee, Lili Kotlerman, Matan Orbach, Yonatan Bilu, Michal Jacovi, Ranit Aharonov, Noam Slonim


Abstract
Recent advancements in machine reading and listening comprehension involve the annotation of long texts. Such tasks are typically time-consuming, making crowd annotation an attractive solution, yet their complexity often makes such a solution infeasible. In particular, a major concern is that crowd annotators may be tempted to skim through long texts and answer questions without reading thoroughly. We present a case study of adapting this type of task to the crowd. The task is to identify claims in a debate speech several minutes long. We show that sentence-by-sentence annotation does not scale and that labeling only a subset of sentences is insufficient. Instead, we propose a scheme for effectively performing the full, complex task with crowd annotators, allowing the collection of large-scale annotated datasets. We believe that the encountered challenges and pitfalls, as well as the lessons learned, are relevant in general when collecting data for large-scale natural language understanding (NLU) tasks.
Anthology ID:
D19-5905
Volume:
Proceedings of the First Workshop on Aggregating and Analysing Crowdsourced Annotations for NLP
Month:
November
Year:
2019
Address:
Hong Kong
Editors:
Silviu Paun, Dirk Hovy
Venue:
WS
Publisher:
Association for Computational Linguistics
Pages:
29–38
URL:
https://aclanthology.org/D19-5905
DOI:
10.18653/v1/D19-5905
Cite (ACL):
Tamar Lavee, Lili Kotlerman, Matan Orbach, Yonatan Bilu, Michal Jacovi, Ranit Aharonov, and Noam Slonim. 2019. Crowd-sourcing annotation of complex NLU tasks: A case study of argumentative content annotation. In Proceedings of the First Workshop on Aggregating and Analysing Crowdsourced Annotations for NLP, pages 29–38, Hong Kong. Association for Computational Linguistics.
Cite (Informal):
Crowd-sourcing annotation of complex NLU tasks: A case study of argumentative content annotation (Lavee et al., 2019)
PDF:
https://aclanthology.org/D19-5905.pdf
Attachment:
D19-5905.Attachment.pdf