Task-Agnostic Low-Rank Adapters for Unseen English Dialects

Zedian Xiao, William Held, Yanchen Liu, Diyi Yang


Abstract
Large Language Models (LLMs) are trained on corpora disproportionately weighted in favor of Standard American English. As a result, speakers of other dialects experience significantly more failures when interacting with these technologies. In practice, these speakers often accommodate their speech to be better understood. Our work shares the belief that language technologies should be designed to accommodate the diversity in English dialects and not the other way around. However, prior work on dialects struggles to generalize to evolving and emerging dialects in a scalable manner. To fill this gap, our method, HyperLoRA, leverages expert linguistic knowledge to enable resource-efficient adaptation via hypernetworks. By disentangling dialect-specific and cross-dialectal information, HyperLoRA improves generalization to unseen dialects in a task-agnostic fashion. Not only is HyperLoRA more scalable in the number of parameters, but it also achieves the best or most competitive performance across 5 dialects in a zero-shot setting. In this way, our approach facilitates access to language technology for billions of English dialect speakers who are traditionally underrepresented.
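To make the hypernetwork-generates-adapter idea in the abstract concrete, here is a minimal PyTorch-style sketch of a hypernetwork that maps a dialect feature vector to rank-r LoRA factors applied on top of a frozen linear layer. All class names, dimensions, and the feature encoding are illustrative assumptions for exposition, not the authors' released implementation.

```python
# Minimal sketch (not the authors' code): a hypernetwork maps a dialect
# feature vector to low-rank adapter (LoRA) factors A and B, which are
# applied as a delta on top of a frozen base linear layer.
import torch
import torch.nn as nn


class LoRAHyperNetwork(nn.Module):
    """Generates rank-r LoRA factors from a dialect feature vector."""

    def __init__(self, feat_dim: int, d_in: int, d_out: int, rank: int = 8, hidden: int = 128):
        super().__init__()
        self.d_in, self.d_out, self.rank = d_in, d_out, rank
        # Shared trunk captures cross-dialectal information ...
        self.trunk = nn.Sequential(nn.Linear(feat_dim, hidden), nn.ReLU())
        # ... while the heads emit the dialect-specific low-rank factors.
        self.head_a = nn.Linear(hidden, d_in * rank)
        self.head_b = nn.Linear(hidden, rank * d_out)

    def forward(self, dialect_feats: torch.Tensor):
        h = self.trunk(dialect_feats)                    # (hidden,)
        A = self.head_a(h).view(self.d_in, self.rank)    # (d_in, r)
        B = self.head_b(h).view(self.rank, self.d_out)   # (r, d_out)
        return A, B


class LoRALinear(nn.Module):
    """Frozen base linear layer plus a hypernetwork-generated LoRA delta."""

    def __init__(self, base: nn.Linear, hypernet: LoRAHyperNetwork):
        super().__init__()
        self.base, self.hypernet = base, hypernet
        for p in self.base.parameters():                 # freeze the backbone
            p.requires_grad_(False)

    def forward(self, x: torch.Tensor, dialect_feats: torch.Tensor):
        A, B = self.hypernet(dialect_feats)
        return self.base(x) + (x @ A) @ B                # W x + x A B


if __name__ == "__main__":
    d_in, d_out, feat_dim = 64, 64, 32                   # illustrative sizes
    layer = LoRALinear(nn.Linear(d_in, d_out), LoRAHyperNetwork(feat_dim, d_in, d_out))
    x = torch.randn(4, d_in)
    feats = torch.rand(feat_dim)                         # e.g. a vector of dialect features
    print(layer(x, feats).shape)                         # torch.Size([4, 64])
```

Because only the hypernetwork is trained while the backbone stays frozen, adapting to a new dialect in this sketch amounts to supplying a new feature vector at inference time, which is what enables zero-shot use on unseen dialects.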
Anthology ID:
2023.emnlp-main.487
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7857–7870
URL:
https://aclanthology.org/2023.emnlp-main.487
DOI:
10.18653/v1/2023.emnlp-main.487
Cite (ACL):
Zedian Xiao, William Held, Yanchen Liu, and Diyi Yang. 2023. Task-Agnostic Low-Rank Adapters for Unseen English Dialects. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 7857–7870, Singapore. Association for Computational Linguistics.
Cite (Informal):
Task-Agnostic Low-Rank Adapters for Unseen English Dialects (Xiao et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.487.pdf
Video:
https://aclanthology.org/2023.emnlp-main.487.mp4