Domain Mismatch Doesn’t Always Prevent Cross-lingual Transfer Learning

Daniel Edmiston, Phillip Keung, Noah A. Smith


Abstract
Cross-lingual transfer learning without labeled target language data or parallel text has been surprisingly effective in zero-shot cross-lingual classification, question answering, unsupervised machine translation, etc. However, some recent publications have claimed that domain mismatch prevents cross-lingual transfer, and their results show that unsupervised bilingual lexicon induction (UBLI) and unsupervised neural machine translation (UNMT) do not work well when the underlying monolingual corpora come from different domains (e.g., French text from Wikipedia but English text from UN proceedings). In this work, we show how a simple initialization regimen can overcome much of the effect of domain mismatch in cross-lingual transfer. We pre-train word and contextual embeddings on the concatenated domain-mismatched corpora, and use these as initializations for three tasks: MUSE UBLI, UN Parallel UNMT, and the SemEval 2017 cross-lingual word similarity task. In all cases, our results challenge the conclusions of prior work by showing that proper initialization can recover a large portion of the losses incurred by domain mismatch.
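The core recipe in the abstract, concatenating the two domain-mismatched monolingual corpora before pre-training and reusing the resulting vectors as an initialization, can be illustrated with a toy sketch. The snippet below is an assumption-laden stand-in: it uses simple co-occurrence counts in place of the paper's fastText/contextual embeddings, and the corpus contents are invented placeholders.

```python
from collections import Counter, defaultdict

def cooccurrence_vectors(sentences, window=2):
    """Toy count-based 'embeddings': co-occurrence counts within a window.
    A stand-in for the word/contextual embedding pre-training step."""
    vecs = defaultdict(Counter)
    for sent in sentences:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if j != i:
                    vecs[w][sent[j]] += 1
    return vecs

# Placeholder "domain-mismatched" corpora (e.g. Wikipedia-style vs. UN-style text).
corpus_domain_a = [["the", "cat", "sat", "down"]]
corpus_domain_b = [["proceedings", "of", "the", "assembly"]]

# Key step from the abstract: pre-train on the *concatenation* of both corpora,
# so one set of vectors covers the vocabulary of both domains.
combined = corpus_domain_a + corpus_domain_b
vectors = cooccurrence_vectors(combined)

# "the" now carries context from both domains, ready to seed a downstream
# system (UBLI, UNMT, or cross-lingual word similarity in the paper's setup).
print(sorted(vectors["the"]))  # → ['assembly', 'cat', 'of', 'proceedings', 'sat']
```

The point of the sketch is only the data flow: a single pre-training pass over the concatenated corpora produces one initialization shared across domains, which is what the paper argues recovers much of the loss from domain mismatch.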
Anthology ID:
2022.lrec-1.94
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Note:
Pages:
892–899
URL:
https://aclanthology.org/2022.lrec-1.94
Cite (ACL):
Daniel Edmiston, Phillip Keung, and Noah A. Smith. 2022. Domain Mismatch Doesn’t Always Prevent Cross-lingual Transfer Learning. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 892–899, Marseille, France. European Language Resources Association.
Cite (Informal):
Domain Mismatch Doesn’t Always Prevent Cross-lingual Transfer Learning (Edmiston et al., LREC 2022)
PDF:
https://aclanthology.org/2022.lrec-1.94.pdf
Data
Europarl