Exploiting Word Sense Disambiguation in Large Language Models for Machine Translation

Van-Hien Tran, Raj Dabre, Hour Kaing, Haiyue Song, Hideki Tanaka, Masao Utiyama


Abstract
Machine Translation (MT) has made great strides with the use of Large Language Models (LLMs) and advanced prompting techniques. However, translating sentences with ambiguous words remains challenging, especially when LLMs have limited proficiency in the source language. This paper introduces two methods to enhance MT performance by leveraging the word sense disambiguation capabilities of LLMs. The first method integrates all the available senses of an ambiguous word into the prompting template. The second method uses a pre-trained source language model to predict the correct sense of the ambiguous word, which is then incorporated into the prompting template. Additionally, we propose two prompting template styles for providing word sense information to LLMs. Experiments on the HOLLY dataset demonstrate the effectiveness of our approach in improving MT performance.
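For concreteness, below is a minimal Python sketch of the two prompting methods the abstract describes. The template wording, the example word "bank", its sense inventory, and all function names are illustrative assumptions, not the paper's exact templates; the actual templates and the two template styles are detailed in the linked PDF.

```python
# Minimal sketch of the two prompting methods described in the abstract.
# The template wording, the example word "bank", its sense inventory, and
# the helper names are illustrative assumptions, not the paper's templates.

SENSES = {
    "bank": [
        "a financial institution that accepts deposits and makes loans",
        "the sloping land at the edge of a river",
    ],
}

def prompt_all_senses(sentence: str, word: str) -> str:
    # Method 1: expose every available sense of the ambiguous word and
    # let the LLM disambiguate while translating.
    listed = "\n".join(f"- {s}" for s in SENSES[word])
    return (
        "Translate the following sentence into German.\n"
        f"The word '{word}' is ambiguous. Its possible senses are:\n"
        f"{listed}\n"
        f"Sentence: {sentence}\n"
        "Translation:"
    )

def prompt_predicted_sense(sentence: str, word: str, predicted_sense: str) -> str:
    # Method 2: a pre-trained source-language WSD model has already
    # selected one sense; only that sense is shown to the LLM.
    return (
        "Translate the following sentence into German.\n"
        f"In this sentence, '{word}' means: {predicted_sense}.\n"
        f"Sentence: {sentence}\n"
        "Translation:"
    )

if __name__ == "__main__":
    src = "He sat on the bank and watched the boats drift by."
    print(prompt_all_senses(src, "bank"))
    print()
    # In the paper a WSD model predicts the sense; here it is hard-coded.
    print(prompt_predicted_sense(src, "bank", SENSES["bank"][1]))
```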
Anthology ID:
2025.loreslm-1.10
Volume:
Proceedings of the First Workshop on Language Models for Low-Resource Languages
Month:
January
Year:
2025
Address:
Abu Dhabi, United Arab Emirates
Editors:
Hansi Hettiarachchi, Tharindu Ranasinghe, Paul Rayson, Ruslan Mitkov, Mohamed Gaber, Damith Premasiri, Fiona Anting Tan, Lasitha Uyangodage
Venues:
LoResLM | WS
Publisher:
Association for Computational Linguistics
Pages:
135–144
URL:
https://aclanthology.org/2025.loreslm-1.10/
Cite (ACL):
Van-Hien Tran, Raj Dabre, Hour Kaing, Haiyue Song, Hideki Tanaka, and Masao Utiyama. 2025. Exploiting Word Sense Disambiguation in Large Language Models for Machine Translation. In Proceedings of the First Workshop on Language Models for Low-Resource Languages, pages 135–144, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Exploiting Word Sense Disambiguation in Large Language Models for Machine Translation (Tran et al., LoResLM 2025)
PDF:
https://aclanthology.org/2025.loreslm-1.10.pdf