Learned Adapters Are Better Than Manually Designed Adapters

Yuming Zhang, Peng Wang, Ming Tan, Wei Zhu


Abstract
Recently, a series of works have looked into further improving adapter-based tuning by manually designing better adapter architectures. Understandably, these manually designed solutions are sub-optimal. In this work, we propose the Learned Adapter framework to automatically learn the optimal adapter architectures for better task adaptation of pre-trained models (PTMs). First, we construct a unified search space for adapter architecture designs. As the optimization method over this search space, we propose a simple-yet-effective method, GDNAS, for better architecture optimization. Extensive experiments show that our Learned Adapter framework can outperform the previous parameter-efficient tuning (PETuning) baselines while tuning comparable or fewer parameters. Moreover: (a) the learned adapter architectures are explainable and transferable across tasks; (b) we demonstrate that our architecture search space design is valid.
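For background on the adapter-based tuning the abstract refers to, the sketch below shows a conventional Houlsby-style bottleneck adapter in PyTorch. This is a generic illustration of what an adapter module is (down-projection, nonlinearity, up-projection, residual), not the learned adapter architectures searched for in this paper; the module and parameter names are illustrative.

```python
# Minimal sketch of a conventional bottleneck adapter (Houlsby et al. style),
# shown only as background for adapter-based tuning; it is NOT the learned
# adapter architecture proposed in this paper.
import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    """Down-project -> nonlinearity -> up-project, with a residual connection."""

    def __init__(self, hidden_size: int, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Only these small projections are trained; the backbone transformer
        # weights stay frozen in adapter-based (parameter-efficient) tuning.
        return hidden_states + self.up(self.act(self.down(hidden_states)))


# Usage example: apply the adapter to a frozen transformer layer's output.
adapter = BottleneckAdapter(hidden_size=768, bottleneck_size=64)
x = torch.randn(2, 16, 768)  # (batch, sequence, hidden)
out = adapter(x)             # same shape as x
```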
Anthology ID:
2023.findings-acl.468
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7420–7437
URL:
https://aclanthology.org/2023.findings-acl.468
DOI:
10.18653/v1/2023.findings-acl.468
Cite (ACL):
Yuming Zhang, Peng Wang, Ming Tan, and Wei Zhu. 2023. Learned Adapters Are Better Than Manually Designed Adapters. In Findings of the Association for Computational Linguistics: ACL 2023, pages 7420–7437, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Learned Adapters Are Better Than Manually Designed Adapters (Zhang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.468.pdf
Video:
https://aclanthology.org/2023.findings-acl.468.mp4