Quantifying and Avoiding Unfair Qualification Labour in Crowdsourcing

Jonathan K. Kummerfeld


Abstract
Extensive work has argued in favour of paying crowd workers a wage that is at least equivalent to the U.S. federal minimum wage. Meanwhile, research on collecting high-quality annotations suggests using a qualification that requires workers to have previously completed a certain number of tasks. If most requesters who pay fairly require workers to have already completed a large number of tasks, then workers need to complete a substantial amount of poorly paid work before they can earn a fair wage. Through analysis of worker discussions and guidance for researchers, we estimate that workers spend approximately 2.25 months of full-time effort on poorly paid tasks in order to get the qualifications needed for better paid tasks. We discuss alternatives to this qualification and conduct a study of the correlation between qualifications and work quality on two NLP tasks. We find that it is possible to reduce the burden on workers while still collecting high-quality data.
Anthology ID:
2021.acl-short.44
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
343–349
URL:
https://aclanthology.org/2021.acl-short.44
DOI:
10.18653/v1/2021.acl-short.44
PDF:
https://aclanthology.org/2021.acl-short.44.pdf
Optional supplementary material:
2021.acl-short.44.OptionalSupplementaryMaterial.zip