Soft Layer Selection with Meta-Learning for Zero-Shot Cross-Lingual Transfer

Weijia Xu, Batool Haider, Jason Krone, Saab Mansour


Abstract
Multilingual pre-trained contextual embedding models (Devlin et al., 2019) have achieved impressive performance on zero-shot cross-lingual transfer tasks. Finding the most effective strategy for fine-tuning these models on high-resource languages so that they transfer well to zero-shot languages is non-trivial. In this paper, we propose a novel meta-optimizer that soft-selects which layers of the pre-trained model to freeze during fine-tuning. We train the meta-optimizer by simulating the zero-shot transfer scenario. Results on cross-lingual natural language inference show that our approach improves over the simple fine-tuning baseline and X-MAML (Nooralahzadeh et al., 2020).
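To illustrate the idea of soft layer selection, here is a minimal, hypothetical sketch (not the authors' implementation): instead of a hard freeze/unfreeze decision, each layer gets a learnable gate logit whose sigmoid scales that layer's gradient update, so a gate near 0 approximates freezing and a gate near 1 approximates full fine-tuning. All names (`soft_update`, `gate_logits`) are illustrative assumptions.

```python
import math

def sigmoid(x):
    # standard logistic function, maps a gate logit to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def soft_update(params, grads, gate_logits, lr=0.1):
    """One SGD step where layer l's update is scaled by alpha_l = sigmoid(s_l).

    params / grads: list of layers, each a list of scalar parameters/gradients.
    gate_logits: one learnable scalar per layer (illustrative; in the paper
    these would be trained by a meta-optimizer on simulated transfer tasks).
    """
    new_params = []
    for layer_params, layer_grads, s in zip(params, grads, gate_logits):
        alpha = sigmoid(s)  # soft selection weight for this layer
        new_params.append([p - lr * alpha * g
                           for p, g in zip(layer_params, layer_grads)])
    return new_params

# Toy example with two "layers": one gate near 0 (almost frozen),
# one gate near 1 (almost fully fine-tuned).
params = [[1.0, 2.0], [3.0, 4.0]]
grads = [[0.5, 0.5], [0.5, 0.5]]
gate_logits = [-10.0, 10.0]
updated = soft_update(params, grads, gate_logits)
```

In a meta-learning setup, the gate logits themselves would receive gradients from held-out (simulated zero-shot) performance, letting the model learn which layers are worth updating for transfer.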
Anthology ID:
2021.metanlp-1.2
Volume:
Proceedings of the 1st Workshop on Meta Learning and Its Applications to Natural Language Processing
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP | MetaNLP
Publisher:
Association for Computational Linguistics
Pages:
11–18
URL:
https://aclanthology.org/2021.metanlp-1.2
DOI:
10.18653/v1/2021.metanlp-1.2
Cite (ACL):
Weijia Xu, Batool Haider, Jason Krone, and Saab Mansour. 2021. Soft Layer Selection with Meta-Learning for Zero-Shot Cross-Lingual Transfer. In Proceedings of the 1st Workshop on Meta Learning and Its Applications to Natural Language Processing, pages 11–18, Online. Association for Computational Linguistics.
Cite (Informal):
Soft Layer Selection with Meta-Learning for Zero-Shot Cross-Lingual Transfer (Xu et al., MetaNLP 2021)
PDF:
https://aclanthology.org/2021.metanlp-1.2.pdf
Data
XNLI