Improving Zero-Shot Cross-lingual Transfer Between Closely Related Languages by Injecting Character-Level Noise

Noëmi Aepli, Rico Sennrich


Abstract
Cross-lingual transfer between a high-resource language and its dialects or closely related language varieties should be facilitated by their similarity. However, current approaches that operate in the embedding space do not take surface similarity into account. This work presents a simple yet effective strategy to improve cross-lingual transfer between closely related varieties. We propose to augment the data of the high-resource source language with character-level noise to make the model more robust to spelling variations. Our strategy shows consistent improvements over several languages and tasks: zero-shot transfer of POS tagging and topic identification between language varieties from the Finnic, West and North Germanic, and Western Romance language branches. Our work provides evidence for the usefulness of simple surface-level noise in improving transfer between language varieties.
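The character-level noise described in the abstract can be sketched as random character insertions, deletions, and substitutions applied to a fraction of the source-language tokens. This is a minimal illustration of the general idea, not the authors' released implementation; the function names, the per-token noise probability, and the restriction to lowercase ASCII replacement characters are all assumptions for the sketch.

```python
import random
import string


def noise_word(word, rng):
    """Apply one random character-level edit (insert, delete, or
    substitute) to a single token. Edit type and position are chosen
    uniformly at random; replacement characters are drawn from
    lowercase ASCII (an assumption of this sketch)."""
    if not word:
        return word
    op = rng.choice(["insert", "delete", "substitute"])
    i = rng.randrange(len(word))
    c = rng.choice(string.ascii_lowercase)
    if op == "insert":
        return word[:i] + c + word[i:]
    if op == "delete":
        return word[:i] + word[i + 1:]
    return word[:i] + c + word[i + 1:]


def inject_noise(sentence, p=0.1, seed=0):
    """Noise each whitespace-separated token independently with
    probability p, leaving the remaining tokens intact. The value
    p=0.1 is a placeholder, not the paper's reported setting."""
    rng = random.Random(seed)
    return " ".join(
        noise_word(tok, rng) if rng.random() < p else tok
        for tok in sentence.split()
    )
```

Augmenting the source-side training data this way exposes the model to spelling variation at training time, which is the intuition behind the robustness gains the abstract reports for closely related varieties.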
Anthology ID:
2022.findings-acl.321
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4074–4083
URL:
https://aclanthology.org/2022.findings-acl.321
DOI:
10.18653/v1/2022.findings-acl.321
Cite (ACL):
Noëmi Aepli and Rico Sennrich. 2022. Improving Zero-Shot Cross-lingual Transfer Between Closely Related Languages by Injecting Character-Level Noise. In Findings of the Association for Computational Linguistics: ACL 2022, pages 4074–4083, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Improving Zero-Shot Cross-lingual Transfer Between Closely Related Languages by Injecting Character-Level Noise (Aepli & Sennrich, Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.321.pdf
Video:
https://aclanthology.org/2022.findings-acl.321.mp4
Data
Universal Dependencies