Controlled Crowdsourcing for High-Quality QA-SRL Annotation

Paul Roit, Ayal Klein, Daniela Stepanov, Jonathan Mamou, Julian Michael, Gabriel Stanovsky, Luke Zettlemoyer, Ido Dagan

Abstract
Question-answer driven Semantic Role Labeling (QA-SRL) was proposed as an attractive open and natural flavour of SRL, potentially attainable from laymen. Recently, a large-scale crowdsourced QA-SRL corpus and a trained parser were released. Trying to replicate the QA-SRL annotation for new texts, we found that the resulting annotations were lacking in quality, particularly in coverage, making them insufficient for further research and evaluation. In this paper, we present an improved crowdsourcing protocol for complex semantic annotation, involving worker selection and training, and a data consolidation phase. Applying this protocol to QA-SRL yielded high-quality annotation with drastically higher coverage, producing a new gold evaluation dataset. We believe that our annotation protocol and gold standard will facilitate future replicable research of natural semantic annotations.
Anthology ID:
2020.acl-main.626
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7008–7013
URL:
https://aclanthology.org/2020.acl-main.626
DOI:
10.18653/v1/2020.acl-main.626
Cite (ACL):
Paul Roit, Ayal Klein, Daniela Stepanov, Jonathan Mamou, Julian Michael, Gabriel Stanovsky, Luke Zettlemoyer, and Ido Dagan. 2020. Controlled Crowdsourcing for High-Quality QA-SRL Annotation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7008–7013, Online. Association for Computational Linguistics.
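For convenience, the ACL citation above can also be expressed as a BibTeX entry assembled from the metadata fields on this page. Note that the entry key below is a placeholder of my choosing, not the official Anthology bibkey (which is not shown on this page); all other fields are taken directly from the metadata above.

```bibtex
@inproceedings{roit-etal-2020-placeholder,
    title     = "Controlled Crowdsourcing for High-Quality {QA}-{SRL} Annotation",
    author    = "Roit, Paul and Klein, Ayal and Stepanov, Daniela and
                 Mamou, Jonathan and Michael, Julian and Stanovsky, Gabriel and
                 Zettlemoyer, Luke and Dagan, Ido",
    booktitle = "Proceedings of the 58th Annual Meeting of the Association
                 for Computational Linguistics",
    month     = jul,
    year      = "2020",
    address   = "Online",
    publisher = "Association for Computational Linguistics",
    url       = "https://aclanthology.org/2020.acl-main.626",
    doi       = "10.18653/v1/2020.acl-main.626",
    pages     = "7008--7013",
}
```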
Cite (Informal):
Controlled Crowdsourcing for High-Quality QA-SRL Annotation (Roit et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.626.pdf
Video:
http://slideslive.com/38929025
Code
plroit/qasrl-gs
Data
QA-SRL