Parameter-Efficient Language Model Tuning with Active Learning in Low-Resource Settings

Josip Jukić, Jan Šnajder


Abstract
Pre-trained language models (PLMs) have ignited a surge in demand for effective fine-tuning techniques, particularly in low-resource domains and languages. Active learning (AL), a set of algorithms designed to decrease labeling costs by minimizing label complexity, has shown promise in confronting the labeling bottleneck. In parallel, adapter modules designed for parameter-efficient fine-tuning (PEFT) have demonstrated notable potential in low-resource settings. However, the interplay between AL and adapter-based PEFT remains unexplored. We present an empirical study of PEFT behavior with AL in low-resource settings for text classification tasks. Our findings affirm the superiority of PEFT over full fine-tuning (FFT) in low-resource settings and demonstrate that this advantage persists in AL setups. We further examine the properties of PEFT and FFT through the lens of forgetting dynamics and instance-level representations, where we find that PEFT yields more stable representations of early and middle layers compared to FFT. Our research underscores the synergistic potential of AL and PEFT in low-resource settings, paving the way for advancements in efficient and effective fine-tuning.
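To make the setup described in the abstract concrete, the sketch below shows a generic combination of adapter-based PEFT with a pool-based active learning acquisition step: a frozen pre-trained encoder, a small trainable bottleneck adapter and classification head, and an entropy-based query function. This is a minimal illustration under assumed interfaces (a HuggingFace-style encoder returning last_hidden_state, a loader yielding (index, batch) pairs), not the authors' implementation; all class and function names here are hypothetical.

import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""
    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, h):
        return h + self.up(torch.relu(self.down(h)))

class AdapterClassifier(nn.Module):
    """Frozen pre-trained encoder + trainable adapter + classification head (PEFT)."""
    def __init__(self, encoder, hidden_size: int, num_labels: int):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():
            p.requires_grad = False  # only the adapter and head are updated
        self.adapter = Adapter(hidden_size)
        self.head = nn.Linear(hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        # Assumes a HuggingFace-style encoder output with last_hidden_state.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] token representation
        return self.head(self.adapter(cls))

def entropy_acquisition(model, unlabeled_loader, k: int):
    """Select the k unlabeled instances with the highest predictive entropy."""
    model.eval()
    scores, indices = [], []
    with torch.no_grad():
        for idx, batch in unlabeled_loader:  # assumed to yield (index, batch dict)
            probs = torch.softmax(model(**batch), dim=-1)
            ent = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
            scores.append(ent)
            indices.append(idx)
    scores, indices = torch.cat(scores), torch.cat(indices)
    return indices[scores.topk(k).indices]  # indices to send for labeling

In an AL round, one would train the adapter and head on the current labeled set, call entropy_acquisition to pick the next batch of pool instances to annotate, and repeat; the choice of entropy here is only one of several common acquisition functions.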
Anthology ID: 2023.emnlp-main.307
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 5061–5074
URL: https://aclanthology.org/2023.emnlp-main.307
DOI: 10.18653/v1/2023.emnlp-main.307
Cite (ACL): Josip Jukić and Jan Šnajder. 2023. Parameter-Efficient Language Model Tuning with Active Learning in Low-Resource Settings. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 5061–5074, Singapore. Association for Computational Linguistics.
Cite (Informal): Parameter-Efficient Language Model Tuning with Active Learning in Low-Resource Settings (Jukić & Šnajder, EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.307.pdf
Video: https://aclanthology.org/2023.emnlp-main.307.mp4