Evaluating Parameter-Efficient Finetuning Approaches for Pre-trained Models on the Financial Domain

Isabella Olariu, Cedric Lothritz, Jacques Klein, Tegawendé Bissyandé, Siwen Guo, Shohreh Haddadan


Abstract
Large-scale language models with millions, billions, or trillions of trainable parameters are becoming increasingly popular. However, they quickly risk becoming over-parameterized, and the cost of fully fine-tuning them rises significantly. Storing them also becomes impractical, as it requires keeping a separate copy of all fine-tuned weights for each task. By freezing all pre-trained weights during fine-tuning, parameter-efficient tuning approaches have become an appealing alternative to traditional fine-tuning. The performance of these approaches has been evaluated on common NLP tasks of the GLUE benchmark and shown to match full fine-tuning performance; however, their impact is less researched in domain-specific fields such as finance. This work compares the performance of a set of financial BERT-like models to their fully fine-tuned counterparts by leveraging different parameter-efficient tuning methods. We find that results are comparable to traditional fine-tuning while gaining in time and resource efficiency.
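To illustrate the idea of freezing the pre-trained weights and training only a small set of added parameters, below is a minimal sketch using LoRA via the HuggingFace PEFT library on a BERT-like classifier. This is not the paper's exact setup: the model name, number of labels, target modules, and LoRA hyperparameters are illustrative assumptions; a financial BERT variant and the paper's tasks would be substituted in practice.

```python
# Hedged sketch: illustrative only; model, labels, and hyperparameters are assumptions.
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

model_name = "bert-base-uncased"  # a financial BERT-like model would be used instead
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

# LoRA keeps the pre-trained weights frozen and injects small trainable
# low-rank matrices into the attention projections; only those are updated.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,                                # rank of the low-rank update matrices
    lora_alpha=16,                      # scaling factor for the update
    lora_dropout=0.1,
    target_modules=["query", "value"],  # BERT attention projection layers
)
peft_model = get_peft_model(model, lora_config)
peft_model.print_trainable_parameters()  # typically well under 1% of all weights
```

Because only the injected matrices are trained, a single frozen base model can be shared across tasks, with one small adapter checkpoint stored per task instead of a full copy of the fine-tuned weights.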
Anthology ID: 2023.findings-emnlp.1035
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 15482–15491
URL: https://aclanthology.org/2023.findings-emnlp.1035
DOI: 10.18653/v1/2023.findings-emnlp.1035
Cite (ACL): Isabella Olariu, Cedric Lothritz, Jacques Klein, Tegawendé Bissyandé, Siwen Guo, and Shohreh Haddadan. 2023. Evaluating Parameter-Efficient Finetuning Approaches for Pre-trained Models on the Financial Domain. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 15482–15491, Singapore. Association for Computational Linguistics.
Cite (Informal): Evaluating Parameter-Efficient Finetuning Approaches for Pre-trained Models on the Financial Domain (Olariu et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.1035.pdf