Analyzing the Effects of Annotator Gender across NLP Tasks

Laura Biester, Vanita Sharma, Ashkan Kazemi, Naihao Deng, Steven Wilson, Rada Mihalcea


Abstract
Recent studies have shown that for subjective annotation tasks, the demographics, lived experiences, and identity of annotators can have a large impact on how items are labeled. We expand on this work, hypothesizing that gender may correlate with differences in annotations for a number of NLP benchmarks, including those that are fairly subjective (e.g., affect in text) and those that are typically considered to be objective (e.g., natural language inference). We develop a robust framework to test for differences in annotation across genders for four benchmark datasets. While our results largely show a lack of statistically significant differences in annotation by males and females for these tasks, the framework can be used to analyze differences in annotation between various other demographic groups in future work. Finally, we note that most datasets are collected without annotator demographics and released only in aggregate form; we call on the community to consider annotator demographics as data is collected, and to release disaggregated data to allow for further work analyzing variability among annotators.
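The abstract does not spell out the statistical framework itself; as one hedged illustration of how a test for annotation differences between two annotator groups might look, the sketch below runs a simple two-sided permutation test on a difference of mean labels. The group data, statistic, and function name are assumptions for illustration, not the paper's actual method.

```python
import random

def permutation_test(labels_a, labels_b, n_perm=10_000, seed=0):
    """Two-sided permutation test for a difference in mean label
    between two annotator groups (e.g., split by annotator gender).

    Illustrative sketch only: the difference-of-means statistic is an
    assumption, not necessarily the test used in the paper.
    """
    rng = random.Random(seed)
    n_a = len(labels_a)
    observed = abs(sum(labels_a) / n_a - sum(labels_b) / len(labels_b))
    pooled = list(labels_a) + list(labels_b)
    hits = 0
    for _ in range(n_perm):
        # Shuffle group membership and recompute the statistic
        rng.shuffle(pooled)
        perm_a, perm_b = pooled[:n_a], pooled[n_a:]
        stat = abs(sum(perm_a) / n_a - sum(perm_b) / len(perm_b))
        if stat >= observed:
            hits += 1
    # Add-one smoothing keeps the p-value strictly positive
    return (hits + 1) / (n_perm + 1)

# Hypothetical affect ratings (1-5 scale) from two annotator groups
p = permutation_test([3, 4, 5, 4, 3, 4], [3, 3, 4, 3, 3, 4])
```

A large p-value here would mirror the paper's headline finding of mostly non-significant differences, though the real analysis covers four benchmark datasets rather than toy ratings.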
Anthology ID:
2022.nlperspectives-1.2
Volume:
Proceedings of the 1st Workshop on Perspectivist Approaches to NLP @LREC2022
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Gavin Abercrombie, Valerio Basile, Sara Tonelli, Verena Rieser, Alexandra Uma
Venue:
NLPerspectives
Publisher:
European Language Resources Association
Pages:
10–19
URL:
https://aclanthology.org/2022.nlperspectives-1.2
Cite (ACL):
Laura Biester, Vanita Sharma, Ashkan Kazemi, Naihao Deng, Steven Wilson, and Rada Mihalcea. 2022. Analyzing the Effects of Annotator Gender across NLP Tasks. In Proceedings of the 1st Workshop on Perspectivist Approaches to NLP @LREC2022, pages 10–19, Marseille, France. European Language Resources Association.
Cite (Informal):
Analyzing the Effects of Annotator Gender across NLP Tasks (Biester et al., NLPerspectives 2022)
PDF:
https://aclanthology.org/2022.nlperspectives-1.2.pdf
Code
 michigannlp/analyzing-the-effects-of-annotator-gender-across-nlp-tasks