On the Language-specificity of Multilingual BERT and the Impact of Fine-tuning

Marc Tanti, Lonneke van der Plas, Claudia Borg, Albert Gatt


Abstract
Recent work has shown evidence that the knowledge acquired by multilingual BERT (mBERT) has two components: a language-specific and a language-neutral one. This paper analyses the relationship between them, in the context of fine-tuning on two tasks – POS tagging and natural language inference – which require the model to bring to bear different degrees of language-specific knowledge. Visualisations reveal that mBERT loses the ability to cluster representations by language after fine-tuning, a result that is supported by evidence from language identification experiments. However, further experiments on ‘unlearning’ language-specific representations using gradient reversal and iterative adversarial learning are shown not to add further improvement to the language-independent component over and above the effect of fine-tuning. The results presented here suggest that the process of fine-tuning causes a reorganisation of the model’s limited representational capacity, enhancing language-independent representations at the expense of language-specific ones.
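For readers unfamiliar with the 'unlearning' technique mentioned above, the following is a minimal PyTorch-style sketch of a gradient reversal layer placed in front of a language classifier, so that the shared encoder is pushed towards language-neutral representations while the classifier still tries to identify the language. Names such as lang_classifier and cls_embedding are hypothetical illustrations, not the authors' released implementation (see mtanti/mbert-language-specificity for that).

```python
import torch

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates (and scales) the gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse the gradient flowing back into the shared encoder, scaled by lambda.
        return -ctx.lambd * grad_output, None

def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)

# Hypothetical usage with an mBERT-style encoder:
#   lang_logits = lang_classifier(grad_reverse(cls_embedding, lambd=0.5))
# Minimising the language-identification loss on lang_logits then trains the
# classifier normally, while the reversed gradient discourages the encoder
# from retaining language-specific features.
```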
Anthology ID:
2021.blackboxnlp-1.15
Volume:
Proceedings of the Fourth BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Jasmijn Bastings, Yonatan Belinkov, Emmanuel Dupoux, Mario Giulianelli, Dieuwke Hupkes, Yuval Pinter, Hassan Sajjad
Venue:
BlackboxNLP
Publisher:
Association for Computational Linguistics
Pages:
214–227
URL:
https://aclanthology.org/2021.blackboxnlp-1.15
DOI:
10.18653/v1/2021.blackboxnlp-1.15
Cite (ACL):
Marc Tanti, Lonneke van der Plas, Claudia Borg, and Albert Gatt. 2021. On the Language-specificity of Multilingual BERT and the Impact of Fine-tuning. In Proceedings of the Fourth BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP, pages 214–227, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
On the Language-specificity of Multilingual BERT and the Impact of Fine-tuning (Tanti et al., BlackboxNLP 2021)
PDF:
https://aclanthology.org/2021.blackboxnlp-1.15.pdf
Video:
https://aclanthology.org/2021.blackboxnlp-1.15.mp4
Code
mtanti/mbert-language-specificity
Data
XNLI