LinChance-NTU for Unconstrained WMT2024 Literary Translation
Kechen Li | Yaotian Tao | Hongyi Huang | Tianbo Ji
Proceedings of the Ninth Conference on Machine Translation (WMT 2024)
The rapid growth of deep learning has spurred significant advancements across industries, particularly in machine translation through large language models (LLMs). However, translating literary texts still presents challenges, including cross-cultural nuances, complex language structures, metaphorical expressions, and cultural differences. To address these issues, this study fine-tunes the Llama and Phi models using both LoRA and full-parameter techniques, alongside a prompt-based translation system. Full-parameter tuning of the Llama-3-Chinese-8B-Instruct model was unsuccessful due to memory constraints. For the WMT task, the fully fine-tuned Phi 3 model was selected for submission due to its more natural and fluent translations. Nonetheless, results showed that LoRA and the prompt-based system significantly improved the Llama 3 model's performance, surpassing other models in BLEU and ROUGE evaluations.
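The abstract contrasts LoRA with full-parameter fine-tuning; the sketch below illustrates, in broad strokes, how LoRA adapters can be attached to a Llama-style checkpoint with the Hugging Face PEFT library. The checkpoint id, rank, and other hyperparameters here are illustrative assumptions, not the configuration reported in the paper.

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "hfl/llama-3-chinese-8b-instruct"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# LoRA trains small low-rank update matrices on selected projection
# layers while the base weights stay frozen, which is why it can fit in
# memory where full-parameter tuning of the 8B model did not.
lora_config = LoraConfig(
    r=16,                                 # low-rank dimension (assumed)
    lora_alpha=32,                        # scaling factor (assumed)
    target_modules=["q_proj", "v_proj"],  # attention projections (assumed)
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a small fraction is trainable

Wrapping the base model with get_peft_model leaves the original weights untouched, so only the adapter matrices receive gradients during training.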