LoRE-Merging: Exploring Low-Rank Estimation For Large Language Model Merging

Zehua Liu, Han Wu, Yuxuan Yao, Xiaojin Fu, Ruifeng She, Xiongwei Han, Tao Zhong, Mingxuan Yuan


Abstract
While most current approaches rely on further training techniques, such as fine-tuning or reinforcement learning, to enhance model capabilities, model merging stands out for its ability to improve models without requiring any additional training. In this paper, we propose a unified framework for model merging based on low-rank estimation of task vectors, without requiring access to the base model, named LoRE-Merging. Our approach is motivated by the observation that task vectors from fine-tuned models frequently exhibit a small number of dominant singular values, making low-rank estimates less prone to interference. We implement the method by formulating merging as an optimization problem. Extensive empirical experiments demonstrate the effectiveness of our framework in mitigating interference and preserving task-specific information, thereby advancing the state of the art in model merging techniques.
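The abstract's core idea, that task vectors concentrate their energy in a few dominant singular values, can be illustrated with a minimal sketch. The paper formulates LoRE-Merging as an optimization problem; the code below is not the authors' implementation but a simple truncated-SVD approximation of synthetic task vectors followed by averaging, which conveys the intuition. The function names and the choice of plain averaging are assumptions for illustration only.

```python
import numpy as np

def low_rank_estimate(task_vector: np.ndarray, rank: int) -> np.ndarray:
    """Truncated-SVD approximation keeping only the top-`rank` singular values."""
    U, S, Vt = np.linalg.svd(task_vector, full_matrices=False)
    return U[:, :rank] @ np.diag(S[:rank]) @ Vt[:rank, :]

def merge_task_vectors(task_vectors, rank: int) -> np.ndarray:
    """Average the low-rank estimates of several task vectors (illustrative merge)."""
    return np.mean([low_rank_estimate(tv, rank) for tv in task_vectors], axis=0)

# Toy example: two synthetic task vectors with a dominant rank-1 component
# plus small noise, mimicking the "few dominant singular values" observation.
rng = np.random.default_rng(0)
tv1 = np.outer(rng.normal(size=8), rng.normal(size=8)) + 0.01 * rng.normal(size=(8, 8))
tv2 = np.outer(rng.normal(size=8), rng.normal(size=8)) + 0.01 * rng.normal(size=(8, 8))
merged = merge_task_vectors([tv1, tv2], rank=1)
print(merged.shape)  # (8, 8)
```

Truncating to the dominant singular directions discards the small-magnitude components where, per the paper's motivation, cross-task interference tends to accumulate.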
Anthology ID:
2025.findings-emnlp.1195
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
21919–21926
URL:
https://aclanthology.org/2025.findings-emnlp.1195/
Cite (ACL):
Zehua Liu, Han Wu, Yuxuan Yao, Xiaojin Fu, Ruifeng She, Xiongwei Han, Tao Zhong, and Mingxuan Yuan. 2025. LoRE-Merging: Exploring Low-Rank Estimation For Large Language Model Merging. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 21919–21926, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
LoRE-Merging: Exploring Low-Rank Estimation For Large Language Model Merging (Liu et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.1195.pdf
Checklist:
https://aclanthology.org/2025.findings-emnlp.1195.checklist.pdf