Preserving Pre-trained Representation Space: On Effectiveness of Prefix-tuning for Large Multi-modal Models

Authors: Donghoon Kim, Gusang Lee, Kyuhong Shim, Byonghyo Shim
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: Findings of the Association for Computational Linguistics: EMNLP 2024
Publisher: Association for Computational Linguistics
Location: Miami, Florida, USA
Date: November 2024
Pages: 797-819
Anthology ID: kim-etal-2024-preserving
DOI: 10.18653/v1/2024.findings-emnlp.44
URL: https://aclanthology.org/2024.findings-emnlp.44/