OpenDelta: A Plug-and-play Library for Parameter-efficient Adaptation of Pre-trained Models

Shengding Hu, Ning Ding, Weilin Zhao, Xingtai Lv, Zhen Zhang, Zhiyuan Liu, Maosong Sun


Abstract
The scale of large pre-trained models (PTMs) poses significant challenges in adapting to downstream tasks due to the high optimization overhead and storage costs associated with full-parameter fine-tuning. To address this, many studies explore parameter-efficient tuning methods, also framed as “delta tuning” in Ding et al. (2022), which update only a small subset of parameters, known as “delta modules”, while keeping the backbone model’s parameters fixed. However, the practicality and flexibility of delta tuning have been limited because existing implementations directly modify the code of the backbone PTMs and hard-code specific delta tuning methods for each PTM. In this paper, we present OpenDelta, an open-source library that overcomes these limitations by providing a plug-and-play implementation of various delta tuning methods. Our novel techniques eliminate the need to modify the backbone PTMs’ code, making OpenDelta compatible with different, even novel, PTMs. OpenDelta is designed to be simple, modular, and extensible, providing a comprehensive platform for researchers and practitioners to adapt large PTMs efficiently.
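To make the plug-and-play idea concrete, the following minimal Python sketch illustrates how a delta module (here a LoRA-style low-rank update) can be attached to a frozen backbone linear layer at runtime, without editing the backbone’s source code. The class and function names below (LoRADelta, attach_delta) are illustrative assumptions for this sketch, not OpenDelta’s actual API.

    import torch
    import torch.nn as nn

    class LoRADelta(nn.Module):
        # Illustrative delta module: adds a trainable low-rank update (B @ A)
        # beside a frozen backbone linear layer.
        def __init__(self, backbone_linear: nn.Linear, rank: int = 8):
            super().__init__()
            self.backbone = backbone_linear
            for p in self.backbone.parameters():  # keep backbone weights fixed
                p.requires_grad = False
            self.lora_A = nn.Parameter(torch.randn(rank, backbone_linear.in_features) * 0.01)
            self.lora_B = nn.Parameter(torch.zeros(backbone_linear.out_features, rank))

        def forward(self, x):
            # Backbone output plus the low-rank delta; only lora_A / lora_B are trained.
            return self.backbone(x) + x @ self.lora_A.T @ self.lora_B.T

    def attach_delta(model: nn.Module, target_name: str = "dense", rank: int = 8):
        # Wrap matching linear sub-modules in place at runtime, so the
        # backbone model's own code never needs to be modified.
        for name, module in model.named_children():
            if isinstance(module, nn.Linear) and target_name in name:
                setattr(model, name, LoRADelta(module, rank))
            else:
                attach_delta(module, target_name, rank)
        return model

After attach_delta(model) is applied, only the delta parameters require gradients, so an optimizer built from filter(lambda p: p.requires_grad, model.parameters()) updates just the delta modules while the backbone stays fixed, which is the storage- and optimization-cost saving the abstract refers to.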
Anthology ID:
2023.acl-demo.26
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Danushka Bollegala, Ruihong Huang, Alan Ritter
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
274–281
URL:
https://aclanthology.org/2023.acl-demo.26
DOI:
10.18653/v1/2023.acl-demo.26
Cite (ACL):
Shengding Hu, Ning Ding, Weilin Zhao, Xingtai Lv, Zhen Zhang, Zhiyuan Liu, and Maosong Sun. 2023. OpenDelta: A Plug-and-play Library for Parameter-efficient Adaptation of Pre-trained Models. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations), pages 274–281, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
OpenDelta: A Plug-and-play Library for Parameter-efficient Adaptation of Pre-trained Models (Hu et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-demo.26.pdf