%0 Conference Proceedings
%T Gender Bias in Multilingual Embeddings and Cross-Lingual Transfer
%A Zhao, Jieyu
%A Mukherjee, Subhabrata
%A Hosseini, Saghar
%A Chang, Kai-Wei
%A Hassan Awadallah, Ahmed
%Y Jurafsky, Dan
%Y Chai, Joyce
%Y Schluter, Natalie
%Y Tetreault, Joel
%S Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
%D 2020
%8 July
%I Association for Computational Linguistics
%C Online
%F zhao-etal-2020-gender
%X Multilingual representations embed words from many languages into a single semantic space such that words with similar meanings are close to each other regardless of the language. These embeddings have been widely used in various settings, such as cross-lingual transfer, where a natural language processing (NLP) model trained on one language is deployed to another language. While the cross-lingual transfer techniques are powerful, they carry gender bias from the source to target languages. In this paper, we study gender bias in multilingual embeddings and how it affects transfer learning for NLP applications. We create a multilingual dataset for bias analysis and propose several ways for quantifying bias in multilingual representations from both the intrinsic and extrinsic perspectives. Experimental results show that the magnitude of bias in the multilingual representations changes differently when we align the embeddings to different target spaces and that the alignment direction can also have an influence on the bias in transfer learning. We further provide recommendations for using the multilingual word representations for downstream tasks.
%R 10.18653/v1/2020.acl-main.260
%U https://aclanthology.org/2020.acl-main.260
%U https://doi.org/10.18653/v1/2020.acl-main.260
%P 2896-2907