Low-Rank Adaptation for Multilingual Summarization: An Empirical Study

Chenxi Whitehouse, Fantine Huot, Jasmijn Bastings, Mostafa Dehghani, Chu-Cheng Lin, Mirella Lapata


Abstract
Although advances in pre-trained Large Language Models have significantly accelerated recent progress in NLP, their ever-increasing size poses substantial challenges for conventional fine-tuning, especially in memory-intensive tasks. We investigate the potential of Parameter-Efficient Fine-Tuning, focusing on Low-Rank Adaptation (LoRA), in the domain of multilingual summarization, a task that is both challenging (due to typically long inputs) and relatively unexplored. We conduct an extensive study across different data availability scenarios, including high- and low-data settings and cross-lingual transfer, leveraging models of different sizes. Our findings reveal that LoRA is competitive with full fine-tuning when trained on large amounts of data, and excels in low-data scenarios and cross-lingual transfer. We also study different strategies for few-shot cross-lingual transfer, finding that continued LoRA tuning outperforms both full fine-tuning and the dynamic composition of language-specific LoRA modules.
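For readers unfamiliar with the technique, below is a minimal sketch of the LoRA parameterization (Hu et al., 2022) that the paper builds on: the pre-trained weight matrix W is frozen and only a low-rank update (alpha/r) * B A is trained, so h = W x + (alpha/r) * B A x. This is an illustrative sketch, not the authors' implementation; the rank r, scaling alpha, and layer dimensions chosen here are assumptions for demonstration only.

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wrap a frozen linear layer with a trainable low-rank update:
    h = W x + (alpha / r) * B A x, with A of shape (r, d_in) and B of shape (d_out, r)."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pre-trained weights W
        # A: small random init; B: zeros, so the update is a no-op before training
        self.lora_a = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x):
        # frozen path plus scaled low-rank correction
        return self.base(x) + self.scaling * (x @ self.lora_a.T @ self.lora_b.T)

# Illustrative usage (dimensions are hypothetical):
layer = LoRALinear(nn.Linear(4096, 4096), r=4)
y = layer(torch.randn(2, 4096))  # only lora_a and lora_b receive gradients

Because only the r * (d_in + d_out) adapter parameters are trained, memory and storage costs drop sharply relative to full fine-tuning, which is what makes the low-data and cross-lingual settings studied in the paper practical.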
Anthology ID: 2024.findings-naacl.77
Volume: Findings of the Association for Computational Linguistics: NAACL 2024
Month: June
Year: 2024
Address: Mexico City, Mexico
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1202–1228
URL: https://aclanthology.org/2024.findings-naacl.77
Cite (ACL): Chenxi Whitehouse, Fantine Huot, Jasmijn Bastings, Mostafa Dehghani, Chu-Cheng Lin, and Mirella Lapata. 2024. Low-Rank Adaptation for Multilingual Summarization: An Empirical Study. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 1202–1228, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal): Low-Rank Adaptation for Multilingual Summarization: An Empirical Study (Whitehouse et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-naacl.77.pdf