Studying and Mitigating Biases in Sign Language Understanding Models

Katherine Atwell, Danielle Bragg, Malihe Alikhani


Abstract
Ensuring that the benefits of sign language technologies are distributed equitably among all community members is crucial. Thus, it is important to address potential biases and inequities that may arise from the design or use of these resources. Crowd-sourced sign language datasets, such as the ASL Citizen dataset, are valuable resources for improving accessibility and preserving linguistic diversity, but they must be used thoughtfully to avoid reinforcing existing biases. In this work, we utilize the rich information about participant demographics and lexical features in the ASL Citizen dataset to study and document the biases that may result from models trained on crowd-sourced sign datasets. Further, we apply several bias mitigation techniques during model training and find that these techniques reduce performance disparities without decreasing accuracy. With the publication of this work, we release the demographic information about the participants in the ASL Citizen dataset to encourage future bias mitigation work in this space.
Anthology ID:
2024.emnlp-main.17
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
268–283
URL:
https://aclanthology.org/2024.emnlp-main.17/
DOI:
10.18653/v1/2024.emnlp-main.17
Cite (ACL):
Katherine Atwell, Danielle Bragg, and Malihe Alikhani. 2024. Studying and Mitigating Biases in Sign Language Understanding Models. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 268–283, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Studying and Mitigating Biases in Sign Language Understanding Models (Atwell et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.17.pdf