Conversational Assistants and Gender Stereotypes: Public Perceptions and Desiderata for Voice Personas

Amanda Cercas Curry, Judy Robertson, Verena Rieser


Abstract
Conversational voice assistants are rapidly developing from purely transactional systems to social companions with “personality”. UNESCO recently stated that the female and submissive personality of current digital assistants gives rise to concern as it reinforces gender stereotypes. In this work, we present results from a participatory design workshop, where we invited people to submit their preferences for what their ideal persona might look like, both in drawings and in a multiple-choice questionnaire. We find no clear consensus, which suggests that one possible solution is to let people configure/personalise their assistants. We then outline a multi-disciplinary project describing how we plan to address the complex question of gender and stereotyping in digital assistants.
Anthology ID:
2020.gebnlp-1.7
Volume:
Proceedings of the Second Workshop on Gender Bias in Natural Language Processing
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Marta R. Costa-jussà, Christian Hardmeier, Will Radford, Kellie Webster
Venue:
GeBNLP
Publisher:
Association for Computational Linguistics
Pages:
72–78
URL:
https://aclanthology.org/2020.gebnlp-1.7
Cite (ACL):
Amanda Cercas Curry, Judy Robertson, and Verena Rieser. 2020. Conversational Assistants and Gender Stereotypes: Public Perceptions and Desiderata for Voice Personas. In Proceedings of the Second Workshop on Gender Bias in Natural Language Processing, pages 72–78, Barcelona, Spain (Online). Association for Computational Linguistics.
Cite (Informal):
Conversational Assistants and Gender Stereotypes: Public Perceptions and Desiderata for Voice Personas (Cercas Curry et al., GeBNLP 2020)
PDF:
https://aclanthology.org/2020.gebnlp-1.7.pdf