Assessing the Portability of Parameter Matrices Trained by Parameter-Efficient Finetuning Methods

Mohammed Mohammed, Anya Belz


Abstract
As the cost of training ever larger language models has grown, so has the interest in reusing previously learnt knowledge. Transfer learning methods have shown how reusing non-task-specific knowledge can help in subsequent task-specific learning. In this paper, we investigate the inverse: porting whole functional modules that encode task-specific knowledge from one model to another. We designed a study comprising 1,440 training/testing runs to test the portability of modules trained by parameter-efficient finetuning (PEFT) techniques, using sentiment analysis as an example task. We test portability in a wide range of scenarios involving different PEFT techniques and different pretrained host models, among other dimensions. We compare the performance of ported modules with that of equivalent modules trained (i) from scratch, and (ii) from parameters sampled from the same distribution as the ported module. We find that the ported modules far outperform the two alternatives tested, but that there are interesting differences between the four PEFT techniques tested. We conclude that task-specific knowledge in the form of structurally modular sets of parameters, as produced by PEFT techniques, is highly portable, but that the degree of success depends on the type of PEFT technique and on differences between the originating and receiving pretrained models.
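To make the porting setup concrete, the following is a minimal sketch using Hugging Face's transformers and peft libraries, with LoRA standing in as one example PEFT technique. The model names, LoRA hyperparameters, and baseline construction below are illustrative assumptions, not the authors' exact experimental configuration.

```python
import torch
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, get_peft_model

# Illustrative LoRA configuration (not the paper's exact hyperparameters).
config = LoraConfig(r=8, lora_alpha=16, target_modules=["query", "value"],
                    task_type="SEQ_CLS")

# Originating model: attach a LoRA module and finetune it on the task
# (the sentiment-analysis training loop is omitted here).
source = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)
source = get_peft_model(source, config)
# ... finetune `source` on sentiment analysis ...

# Receiving model: the same architecture here so the matrices are
# shape-compatible; the study also varies the pretrained host model.
target = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)
target = get_peft_model(target, config)

# Port the trained module: copy only the LoRA parameter matrices.
lora_weights = {k: v for k, v in source.state_dict().items() if "lora_" in k}
target.load_state_dict(lora_weights, strict=False)

# Baseline (ii) from the abstract: replace each ported matrix with values
# sampled from the same distribution (approximated here as a Gaussian
# matched to the ported matrix's mean and standard deviation).
with torch.no_grad():
    for name, param in target.named_parameters():
        if "lora_" in name:
            mu, sigma = param.mean().item(), param.std().item()
            param.copy_(torch.empty_like(param).normal_(mu, max(sigma, 1e-8)))
```

In the study itself, the ported module and the two baselines would be evaluated as separate models; the sketch applies both steps to one model only for brevity.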
Anthology ID: 2024.findings-eacl.106
Volume: Findings of the Association for Computational Linguistics: EACL 2024
Month: March
Year: 2024
Address: St. Julian’s, Malta
Editors: Yvette Graham, Matthew Purver
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1548–1556
URL: https://aclanthology.org/2024.findings-eacl.106
Cite (ACL): Mohammed Mohammed and Anya Belz. 2024. Assessing the Portability of Parameter Matrices Trained by Parameter-Efficient Finetuning Methods. In Findings of the Association for Computational Linguistics: EACL 2024, pages 1548–1556, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal): Assessing the Portability of Parameter Matrices Trained by Parameter-Efficient Finetuning Methods (Mohammed & Belz, Findings 2024)
PDF: https://aclanthology.org/2024.findings-eacl.106.pdf