Mixture-of-Subspaces in Low-Rank Adaptation

Taiqiang Wu, Jiahao Wang, Zhe Zhao, Ngai Wong


Abstract
In this paper, we introduce a subspace-inspired Low-Rank Adaptation (LoRA) method that is computationally efficient, easy to implement, and readily applicable to large language, multimodal, and diffusion models. We first equivalently decompose the weights of LoRA into two subspaces and find that simply mixing them can enhance performance. To better understand this phenomenon, we revisit it through a fine-grained subspace lens, showing that the modification is equivalent to employing a fixed mixer to fuse the subspaces. For greater flexibility, we jointly learn the mixer with the original LoRA weights, and term the method Mixture-of-Subspaces LoRA (MoSLoRA). MoSLoRA consistently outperforms LoRA on tasks in different modalities, including commonsense reasoning, visual instruction tuning, and subject-driven text-to-image generation, demonstrating its effectiveness and robustness.
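The sketch below illustrates the idea described in the abstract: a LoRA update B A augmented with a learnable r×r mixer M between the down- and up-projections, so the update becomes B M A. This is a minimal, hedged PyTorch illustration, not the authors' released implementation; the class name MoSLoRALinear, the identity initialization of the mixer, and the alpha/r scaling are assumptions made for the example.

```python
import torch
import torch.nn as nn


class MoSLoRALinear(nn.Module):
    """Minimal sketch of a linear layer with a mixture-of-subspaces LoRA update.

    The frozen base weight W is augmented with B @ M @ A, where A (r x d_in)
    and B (d_out x r) are the usual LoRA factors and M (r x r) is a learnable
    mixer that fuses the rank-wise subspaces. Initialization and scaling here
    are illustrative choices, not necessarily those used in the paper.
    """

    def __init__(self, d_in: int, d_out: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(d_in, d_out, bias=False)
        self.base.weight.requires_grad_(False)  # pretrained weight stays frozen

        self.lora_A = nn.Parameter(torch.randn(r, d_in) * 0.01)  # down-projection
        self.lora_B = nn.Parameter(torch.zeros(d_out, r))        # up-projection
        self.mixer = nn.Parameter(torch.eye(r))                  # learnable r x r mixer (assumed init)
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # frozen base path + low-rank path with subspace mixing: x -> A -> M -> B
        delta = x @ self.lora_A.T @ self.mixer.T @ self.lora_B.T
        return self.base(x) + self.scaling * delta


# Usage: drop in for a frozen nn.Linear; only A, B, and the mixer are trained.
layer = MoSLoRALinear(768, 768, r=8)
out = layer(torch.randn(2, 768))
```

With M fixed to the identity this reduces to vanilla LoRA, which matches the abstract's observation that mixing the two subspaces corresponds to a fixed mixer, while MoSLoRA learns M jointly with A and B.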
Anthology ID:
2024.emnlp-main.450
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7880–7899
URL:
https://aclanthology.org/2024.emnlp-main.450
Cite (ACL):
Taiqiang Wu, Jiahao Wang, Zhe Zhao, and Ngai Wong. 2024. Mixture-of-Subspaces in Low-Rank Adaptation. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 7880–7899, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Mixture-of-Subspaces in Low-Rank Adaptation (Wu et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.450.pdf
Software:
2024.emnlp-main.450.software.zip