AutoPEFT: Automatic Configuration Search for Parameter-Efficient Fine-Tuning

Han Zhou, Xingchen Wan, Ivan Vulić, Anna Korhonen


Abstract
Large pretrained language models are widely used in downstream NLP tasks via task-specific fine-tuning, but such procedures can be costly. Recently, Parameter-Efficient Fine-Tuning (PEFT) methods have achieved strong task performance while updating far fewer parameters than full model fine-tuning (FFT). However, it is non-trivial to make informed design choices on the PEFT configurations, such as their architecture, the number of tunable parameters, and even the layers in which the PEFT modules are inserted. Consequently, it is highly likely that the current, manually designed configurations are suboptimal in terms of their performance-efficiency trade-off. Inspired by advances in neural architecture search, we propose AutoPEFT for automatic PEFT configuration selection: We first design an expressive configuration search space with multiple representative PEFT modules as building blocks. Using multi-objective Bayesian optimization in a low-cost setup, we then discover a Pareto-optimal set of configurations with strong performance-cost trade-offs across different numbers of parameters that are also highly transferable across different tasks. Empirically, on GLUE and SuperGLUE tasks, we show that AutoPEFT-discovered configurations significantly outperform existing PEFT methods and are on par or better than FFT without incurring substantial training efficiency costs.
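For intuition, the abstract's core idea, searching over PEFT configurations and keeping a Pareto front of accuracy versus parameter count, can be sketched in a few lines of Python. Everything below is an illustrative assumption: the configuration fields, the `evaluate` stub, and the random proposal step stand in for the paper's actual search space and Bayesian optimization loop, which are not reproduced here.

```python
import random
from dataclasses import dataclass

# Hypothetical PEFT configuration: which modules to insert, at what size,
# and into which transformer layers. Field names are illustrative, not
# the paper's exact search space.
@dataclass(frozen=True)
class PEFTConfig:
    serial_adapter_size: int    # bottleneck width; 0 disables the module
    parallel_adapter_size: int
    prefix_length: int
    layers: tuple               # indices of layers that receive PEFT modules

def num_parameters(cfg: PEFTConfig, hidden: int = 768) -> int:
    """Rough count of added tunable parameters (down- and up-projections
    per adapter, plus prefix key/value vectors), summed over chosen layers."""
    per_layer = 2 * hidden * (cfg.serial_adapter_size + cfg.parallel_adapter_size)
    per_layer += 2 * hidden * cfg.prefix_length
    return per_layer * len(cfg.layers)

def evaluate(cfg: PEFTConfig) -> float:
    """Stub: in practice, fine-tune with cfg in a low-cost proxy setup
    and return validation accuracy. Placeholder score here."""
    return random.random()

def pareto_front(results):
    """Keep configurations not dominated on (higher accuracy, fewer parameters)."""
    front = []
    for cfg, acc, size in results:
        dominated = any(a >= acc and s <= size and (a > acc or s < size)
                        for _, a, s in results)
        if not dominated:
            front.append((cfg, acc, size))
    return front

def random_config(n_layers: int = 12) -> PEFTConfig:
    return PEFTConfig(
        serial_adapter_size=random.choice([0, 8, 32, 64]),
        parallel_adapter_size=random.choice([0, 8, 32, 64]),
        prefix_length=random.choice([0, 10, 30]),
        layers=tuple(sorted(random.sample(range(n_layers),
                                          k=random.randint(1, n_layers)))),
    )

if __name__ == "__main__":
    results = []
    for _ in range(50):  # a BO loop would propose these adaptively
        cfg = random_config()
        results.append((cfg, evaluate(cfg), num_parameters(cfg)))
    for cfg, acc, size in pareto_front(results):
        print(f"acc={acc:.3f} params={size:,} cfg={cfg}")
```

In the paper's method, the random proposals would be replaced by a multi-objective Bayesian optimization acquisition step that trades off the two objectives, and the configurations on the discovered front would then be transferred to other tasks.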
Anthology ID:
2024.tacl-1.29
Volume:
Transactions of the Association for Computational Linguistics, Volume 12
Year:
2024
Address:
Cambridge, MA
Venue:
TACL
Publisher:
MIT Press
Pages:
525–542
URL:
https://aclanthology.org/2024.tacl-1.29
DOI:
10.1162/tacl_a_00662
Cite (ACL):
Han Zhou, Xingchen Wan, Ivan Vulić, and Anna Korhonen. 2024. AutoPEFT: Automatic Configuration Search for Parameter-Efficient Fine-Tuning. Transactions of the Association for Computational Linguistics, 12:525–542.
Cite (Informal):
AutoPEFT: Automatic Configuration Search for Parameter-Efficient Fine-Tuning (Zhou et al., TACL 2024)
PDF:
https://aclanthology.org/2024.tacl-1.29.pdf