Boosting Zero-shot Cross-lingual Retrieval by Training on Artificially Code-Switched Data

Robert Litschko, Ekaterina Artemova, Barbara Plank


Abstract
Transferring information retrieval (IR) models from a high-resource language (typically English) to other languages in a zero-shot fashion has become a widely adopted approach. In this work, we show that the effectiveness of zero-shot rankers diminishes when queries and documents are in different languages. Motivated by this, we propose to instead train ranking models on artificially code-switched data, which we generate using bilingual lexicons. To this end, we experiment with lexicons induced from (1) cross-lingual word embeddings and (2) parallel Wikipedia page titles. We use the mMARCO dataset to extensively evaluate reranking models on 36 language pairs spanning Monolingual IR (MoIR), Cross-lingual IR (CLIR), and Multilingual IR (MLIR). Our results show that code-switching can yield consistent and substantial gains of 5.1 MRR@10 in CLIR and 3.9 MRR@10 in MLIR, while maintaining stable performance in MoIR. Encouragingly, the gains are especially pronounced for distant languages (up to 2x absolute gain). We further show that our approach is robust towards the ratio of code-switched tokens and also extends to unseen languages. Our results demonstrate that training on code-switched data is a cheap and effective way of generalizing zero-shot rankers for cross-lingual and multilingual retrieval.
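
To make the data-generation idea concrete, below is a minimal sketch of lexicon-based code-switching: a fraction of the tokens in an English training query is replaced with translations looked up in a bilingual lexicon. The toy EN→DE lexicon, the `code_switch` function name, and the 50% substitution ratio are illustrative assumptions for this sketch, not the authors' exact procedure.

```python
import random

# Hypothetical word-level EN->DE lexicon. In the paper, such lexicons are
# induced from cross-lingual word embeddings or parallel Wikipedia titles.
TOY_LEXICON = {
    "city": "Stadt",
    "capital": "Hauptstadt",
    "france": "Frankreich",
    "river": "Fluss",
}

def code_switch(tokens, lexicon, ratio=0.5, seed=0):
    """Replace roughly `ratio` of the translatable tokens with their
    bilingual-lexicon translations, leaving all other tokens unchanged."""
    rng = random.Random(seed)
    switched = []
    for tok in tokens:
        translation = lexicon.get(tok.lower())
        if translation is not None and rng.random() < ratio:
            switched.append(translation)
        else:
            switched.append(tok)
    return switched

# Usage: code-switch a single training query.
query = "what is the capital city of france".split()
print(" ".join(code_switch(query, TOY_LEXICON, ratio=0.5)))
# e.g. -> "what is the Hauptstadt city of Frankreich"
```

Applying this substitution to the English training queries and passages exposes the ranker to mixed-language input during training, which is what lets it generalize to cross-lingual and multilingual retrieval at test time.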
Anthology ID:
2023.findings-acl.193
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3096–3108
URL:
https://aclanthology.org/2023.findings-acl.193
DOI:
10.18653/v1/2023.findings-acl.193
Cite (ACL):
Robert Litschko, Ekaterina Artemova, and Barbara Plank. 2023. Boosting Zero-shot Cross-lingual Retrieval by Training on Artificially Code-Switched Data. In Findings of the Association for Computational Linguistics: ACL 2023, pages 3096–3108, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Boosting Zero-shot Cross-lingual Retrieval by Training on Artificially Code-Switched Data (Litschko et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.193.pdf
Video:
https://aclanthology.org/2023.findings-acl.193.mp4