Using Item Response Theory to Measure Gender and Racial Bias of a BERT-based Automated English Speech Assessment System

Alexander Kwako, Yixin Wan, Jieyu Zhao, Kai-Wei Chang, Li Cai, Mark Hansen


Abstract
Recent advances in natural language processing and transformer-based models have made it easier to implement accurate, automated English speech assessments. Yet, without careful examination, applications of these models may exacerbate social prejudices based on gender and race. This study addresses the need to examine potential biases of transformer-based models in the context of automated English speech assessment. For this purpose, we developed a BERT-based automated speech assessment system and investigated gender and racial bias of examinees’ automated scores. Gender and racial bias was measured by examining differential item functioning (DIF) using an item response theory framework. Preliminary results, which focused on a single verbal-response item, showed no statistically significant DIF based on gender or race for automated scores.
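The paper itself ships no code, but the two-stage workflow described in the abstract — an automated transformer-based scorer followed by a DIF analysis — can be sketched. The snippet below is a minimal, hypothetical illustration only: it scores response transcripts with a generic, untrained BERT regression head (not the authors' fine-tuned system) and screens a dichotomized item score for gender DIF using logistic-regression DIF, a common simpler alternative to the full IRT-based DIF analysis used in the paper. The model checkpoint, column names, and toy data are all assumptions for illustration.

```python
# Minimal sketch (hypothetical): BERT-based scoring of spoken-response
# transcripts, followed by a logistic-regression DIF screen for group bias.
# This is NOT the authors' system; checkpoint, columns, and data are
# illustrative assumptions.
import pandas as pd
import torch
from scipy.stats import chi2
from statsmodels.formula.api import logit
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# 1) Score transcripts with a BERT regression head (num_labels=1).
#    The paper would use a fine-tuned model; this head is untrained.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1
)

def score_transcripts(texts):
    """Return a continuous automated score for each response transcript."""
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        return model(**inputs).logits.squeeze(-1).tolist()

# 2) DIF screen. The paper uses IRT-based DIF; logistic-regression DIF
#    (Swaminathan & Rogers, 1990) is a simpler stand-in: compare a model
#    with only the matching variable (total score) against one that also
#    includes group membership, via a likelihood-ratio test.
def logistic_dif(df):
    """df needs columns: item (0/1), total (matching score), group (e.g. gender)."""
    base = logit("item ~ total", data=df).fit(disp=0)
    augmented = logit("item ~ total + C(group)", data=df).fit(disp=0)
    g2 = 2.0 * (augmented.llf - base.llf)  # likelihood-ratio statistic
    p_value = chi2.sf(g2, df=1)            # 1 df for the added group term
    return g2, p_value

if __name__ == "__main__":
    # Toy data: dichotomized item score, total test score, and group label.
    toy = pd.DataFrame({
        "item":  [1, 0, 1, 1, 0, 0, 0, 1, 1, 0, 1, 0],
        "total": [5, 4, 6, 3, 3, 6, 2, 5, 4, 5, 6, 2],
        "group": ["F", "F", "F", "F", "F", "F", "M", "M", "M", "M", "M", "M"],
    })
    g2, p = logistic_dif(toy)
    print(f"LR statistic = {g2:.3f}, p = {p:.3f}")
```

In the paper's setting, the automated scores from step 1 would serve as the item responses examined for DIF in step 2, with an IRT model (rather than logistic regression) providing the matching and the DIF test.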
Anthology ID: 2022.bea-1.1
Volume: Proceedings of the 17th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2022)
Month: July
Year: 2022
Address: Seattle, Washington
Editors: Ekaterina Kochmar, Jill Burstein, Andrea Horbach, Ronja Laarmann-Quante, Nitin Madnani, Anaïs Tack, Victoria Yaneva, Zheng Yuan, Torsten Zesch
Venue: BEA
SIG: SIGEDU
Publisher: Association for Computational Linguistics
Pages: 1–7
URL: https://aclanthology.org/2022.bea-1.1
DOI: 10.18653/v1/2022.bea-1.1
Cite (ACL): Alexander Kwako, Yixin Wan, Jieyu Zhao, Kai-Wei Chang, Li Cai, and Mark Hansen. 2022. Using Item Response Theory to Measure Gender and Racial Bias of a BERT-based Automated English Speech Assessment System. In Proceedings of the 17th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2022), pages 1–7, Seattle, Washington. Association for Computational Linguistics.
Cite (Informal): Using Item Response Theory to Measure Gender and Racial Bias of a BERT-based Automated English Speech Assessment System (Kwako et al., BEA 2022)
PDF: https://aclanthology.org/2022.bea-1.1.pdf
Video: https://aclanthology.org/2022.bea-1.1.mp4