On the Nature of BERT: Correlating Fine-Tuning and Linguistic Competence

Federica Merendi, Felice Dell’Orletta, Giulia Venturi


Abstract
Several studies on the interpretation of Neural Language Models (NLMs) focus on the linguistic generalization abilities of pre-trained models. However, little attention has been paid to how a model's linguistic knowledge changes during fine-tuning. In this paper, we contribute to this line of research by showing the extent to which a wide range of linguistic phenomena is forgotten across 50 epochs of fine-tuning, and how the preserved linguistic knowledge correlates with the resolution of the fine-tuning task. To this end, we consider a rather understudied task in which linguistic information plays the main role, i.e. predicting the evolution of the written language competence of native language learners. In addition, we investigate whether the fine-tuned NLM's accuracy across the 50 epochs can be predicted solely from the assessed linguistic competence. Our results are encouraging and show a strong correlation between the model's linguistic competence and its ability to solve a linguistically based downstream task.
Anthology ID:
2022.coling-1.275
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3109–3119
URL:
https://aclanthology.org/2022.coling-1.275
Cite (ACL):
Federica Merendi, Felice Dell’Orletta, and Giulia Venturi. 2022. On the Nature of BERT: Correlating Fine-Tuning and Linguistic Competence. In Proceedings of the 29th International Conference on Computational Linguistics, pages 3109–3119, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
On the Nature of BERT: Correlating Fine-Tuning and Linguistic Competence (Merendi et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.275.pdf