CUNI at WMT24 General Translation Task: LLMs, (Q)LoRA, CPO and Model Merging

Miroslav Hrabal, Josef Jon, Martin Popel, Nam Luu, Danil Semin, Ondřej Bojar


Abstract
This paper presents the contributions of Charles University teams to the WMT24 General Translation task (English to Czech, German and Russian, and Czech to Ukrainian) and the WMT24 Translation into Low-Resource Languages of Spain task. Our most elaborate submission, CUNI-MH for en2cs, is the result of fine-tuning Mistral 7B v0.1 for translation using a three-stage process: supervised fine-tuning using QLoRA, Contrastive Preference Optimization, and merging of model checkpoints. We also describe the CUNI-GA, CUNI-Transformer and CUNI-DocTransformer submissions, which are based on our systems from the previous year. Our en2ru system CUNI-DS uses a first stage similar to CUNI-MH (QLoRA for en2cs) and follows with transfer to en2ru. For en2de (CUNI-NL), we experimented with an LLM-based speech translation system, used here to translate without the speech input. For the Translation into Low-Resource Languages of Spain task, we performed QLoRA fine-tuning of a large LLM on a small amount of synthetic (backtranslated) data.
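The final stage of the CUNI-MH pipeline merges model checkpoints. The paper does not spell out the merging recipe in the abstract, so the following is a minimal sketch assuming the simplest common variant, uniform element-wise averaging of parameters across checkpoints; for clarity it uses plain Python dicts of float lists in place of real tensor state dicts, and the function name `merge_checkpoints` is illustrative, not from the paper.

```python
# Hedged sketch of checkpoint merging via uniform weight averaging.
# Assumption: each checkpoint is a dict mapping parameter names to
# equal-length lists of floats (a stand-in for tensor state dicts).

def merge_checkpoints(checkpoints):
    """Average each parameter element-wise across all checkpoints."""
    merged = {}
    for name in checkpoints[0]:
        n_params = len(checkpoints[0][name])
        merged[name] = [
            sum(ckpt[name][i] for ckpt in checkpoints) / len(checkpoints)
            for i in range(n_params)
        ]
    return merged

# Toy usage: averaging two two-parameter "checkpoints".
ckpt_a = {"w": [1.0, 2.0]}
ckpt_b = {"w": [3.0, 4.0]}
print(merge_checkpoints([ckpt_a, ckpt_b]))  # {'w': [2.0, 3.0]}
```

In practice the same averaging loop would run over `torch` state dicts loaded from saved fine-tuning checkpoints; whether the authors used uniform or weighted averaging is not stated in the abstract.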
Anthology ID:
2024.wmt-1.16
Volume:
Proceedings of the Ninth Conference on Machine Translation
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Barry Haddow, Tom Kocmi, Philipp Koehn, Christof Monz
Venue:
WMT
Publisher:
Association for Computational Linguistics
Pages:
232–246
URL:
https://aclanthology.org/2024.wmt-1.16
Cite (ACL):
Miroslav Hrabal, Josef Jon, Martin Popel, Nam Luu, Danil Semin, and Ondřej Bojar. 2024. CUNI at WMT24 General Translation Task: LLMs, (Q)LoRA, CPO and Model Merging. In Proceedings of the Ninth Conference on Machine Translation, pages 232–246, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
CUNI at WMT24 General Translation Task: LLMs, (Q)LoRA, CPO and Model Merging (Hrabal et al., WMT 2024)
PDF:
https://aclanthology.org/2024.wmt-1.16.pdf