Investigating Failures of Automatic Translation in the Case of Unambiguous Gender

Adithya Renduchintala, Adina Williams


Abstract
Transformer-based models are the modern workhorses of neural machine translation (NMT), reaching state-of-the-art performance across several benchmarks. Despite their impressive accuracy, we observe a systemic and rudimentary class of errors made by current state-of-the-art NMT models when translating from a language that does not mark gender on nouns into languages that do. We find that even when the surrounding context provides unambiguous evidence of the appropriate grammatical gender marking, no tested model was able to systematically produce correctly gendered occupation nouns. We release an evaluation scheme and dataset for measuring the ability of NMT models to translate gender morphology correctly in unambiguous contexts across syntactically diverse sentences. Our dataset covers translation from an English source into 20 languages from several different language families. With the availability of this dataset, our hope is that the NMT community can iterate on solutions for this especially egregious class of errors.
Anthology ID:
2022.acl-long.243
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3454–3469
URL:
https://aclanthology.org/2022.acl-long.243
DOI:
10.18653/v1/2022.acl-long.243
Cite (ACL):
Adithya Renduchintala and Adina Williams. 2022. Investigating Failures of Automatic Translation in the Case of Unambiguous Gender. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3454–3469, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Investigating Failures of Automatic Translation in the Case of Unambiguous Gender (Renduchintala & Williams, ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.243.pdf
Software:
2022.acl-long.243.software.zip