Riemannian Optimization for LoRA on the Stiefel Manifold

JuneYoung Park, Minjae Kang, Seongbae Lee, Haegang Lee, Seongwan Kim, Jaeho Lee


Abstract
While powerful, large language models (LLMs) present significant fine-tuning challenges due to their size. Parameter-efficient fine-tuning (PEFT) methods like LoRA provide solutions, yet suffer from critical optimizer inefficiencies, notably basis redundancy in LoRA’s B matrix when using AdamW, which fundamentally limits performance. We address this by optimizing the B matrix on the Stiefel manifold, imposing explicit orthogonality constraints that achieve near-perfect orthogonality and full effective rank. This geometric approach substantially enhances parameter efficiency and representational capacity. Our Stiefel optimizer consistently outperforms AdamW across benchmarks with both LoRA and DoRA, demonstrating that geometric constraints are key to unlocking LoRA’s full potential for effective LLM fine-tuning.
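The paper's own optimizer is not reproduced here, but the core operation the abstract describes, a Riemannian gradient step on the Stiefel manifold St(n, p) = {X : XᵀX = I}, can be sketched as below. The function name `stiefel_step`, the Euclidean-metric tangent-space projection, and the QR-based retraction are illustrative assumptions, not necessarily the exact choices made in the paper.

```python
import numpy as np

def stiefel_step(X, G, lr):
    """One sketched Riemannian gradient step on the Stiefel manifold.

    X  : (n, p) point on St(n, p), i.e. X.T @ X == I_p
    G  : (n, p) Euclidean gradient of the loss at X
    lr : step size
    """
    # Project the Euclidean gradient onto the tangent space at X
    # (using the embedded Euclidean metric): G - X * sym(X^T G).
    sym = (X.T @ G + G.T @ X) / 2
    riem_grad = G - X @ sym
    # Retract the update back onto the manifold via QR decomposition.
    Q, R = np.linalg.qr(X - lr * riem_grad)
    # Fix column signs so the retraction is uniquely defined.
    Q = Q * np.sign(np.sign(np.diag(R)) + 0.5)
    return Q

# Example: a random Stiefel point (e.g. a LoRA B factor) stays
# orthonormal after the update, preserving full effective rank.
rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((16, 4)))  # point on St(16, 4)
G = rng.standard_normal((16, 4))                   # fake gradient
X_new = stiefel_step(X, G, lr=0.1)
print(np.allclose(X_new.T @ X_new, np.eye(4)))     # orthogonality preserved
```

The key property, as the abstract emphasizes, is that the iterate never leaves the manifold, so the B matrix retains exactly orthonormal columns throughout training rather than approximating orthogonality via a penalty term.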
Anthology ID:
2025.findings-emnlp.1143
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
20971–20985
URL:
https://aclanthology.org/2025.findings-emnlp.1143/
Cite (ACL):
JuneYoung Park, Minjae Kang, Seongbae Lee, Haegang Lee, Seongwan Kim, and Jaeho Lee. 2025. Riemannian Optimization for LoRA on the Stiefel Manifold. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 20971–20985, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Riemannian Optimization for LoRA on the Stiefel Manifold (Park et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.1143.pdf
Checklist:
 2025.findings-emnlp.1143.checklist.pdf