Answer Quality Aware Aggregation for Extractive QA Crowdsourcing

Peide Zhu, Zhen Wang, Claudia Hauff, Jie Yang, Avishek Anand


Abstract
Quality control is essential for creating extractive question answering (EQA) datasets via crowdsourcing. Aggregating the answers, i.e., the word spans within a passage annotated by different crowd workers, is one major approach to ensuring quality. However, crowd workers fail to reach a consensus on a considerable portion of questions. We introduce a simple yet effective answer aggregation method that accounts for the relations among the answer, the question, and the context passage. We evaluate answer quality from two views: a question answering model determines how confident the QA model is about each answer, and an answer verification model determines whether the answer is correct. We then compute aggregation scores from each answer's quality and its contextual embedding produced by a pre-trained language model. Experiments on a large real crowdsourced EQA dataset show that our framework outperforms baselines by around 16% in precision and effectively aggregates answers for the extractive QA task.
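The aggregation idea described in the abstract can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's exact method: each candidate span is assumed to come with a QA-model confidence, a verifier probability, and an embedding (in practice produced by a pre-trained language model), and the span supported by the highest total quality among similar answers is selected.

```python
# Hypothetical sketch of quality-aware answer aggregation.
# Assumed inputs (not from the paper): per-answer QA confidence,
# verifier probability, and a contextual embedding per answer.
import numpy as np

def aggregate(answers, qa_conf, verify_prob, embeddings, sim_threshold=0.8):
    """Return the answer whose similarity group has the highest summed quality."""
    emb = np.asarray(embeddings, dtype=float)
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)   # cosine-normalise
    sim = emb @ emb.T                                        # pairwise cosine similarity
    quality = np.asarray(qa_conf) * np.asarray(verify_prob)  # combined quality score
    # Score each answer by the quality mass of its similar neighbours
    # (including itself), so near-duplicate spans reinforce each other.
    scores = (sim >= sim_threshold) @ quality
    return answers[int(np.argmax(scores))]

# Toy example: two near-duplicate spans outweigh a dissimilar one.
answers = ["in 1920", "1920", "the city"]
best = aggregate(
    answers,
    qa_conf=[0.9, 0.8, 0.4],
    verify_prob=[0.9, 0.9, 0.5],
    embeddings=[[1.0, 0.0], [0.95, 0.31], [0.0, 1.0]],
)
```

Here the two similar spans pool their quality scores and win over the isolated third answer; the paper's actual scoring combines the quality views and embeddings differently.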
Anthology ID:
2022.findings-emnlp.457
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6147–6159
URL:
https://aclanthology.org/2022.findings-emnlp.457
DOI:
10.18653/v1/2022.findings-emnlp.457
Cite (ACL):
Peide Zhu, Zhen Wang, Claudia Hauff, Jie Yang, and Avishek Anand. 2022. Answer Quality Aware Aggregation for Extractive QA Crowdsourcing. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 6147–6159, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Answer Quality Aware Aggregation for Extractive QA Crowdsourcing (Zhu et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.457.pdf