Chi-Yi Lin
2024
Learning-From-Mistakes Prompting for Indigenous Language Translation
You Cheng Liao | Chen-Jui Yu | Chi-Yi Lin | He-Feng Yun | Yen-Hsiang Wang | Hsiao-Min Li | Yao-Chung Fan
Proceedings of the Seventh Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2024)
Using large language models, this paper presents techniques to improve translation for extremely low-resource indigenous languages. Our approaches are grounded in (1) a datastore consisting of a limited number of parallel translation examples, (2) the inherent capabilities of LLMs such as GPT-3.5, and (3) a word-level translation dictionary. In this setting, we harness the potential of LLMs and in-context learning techniques to use LLMs as universal translators for extremely low-resource languages. Our methodology hinges on utilizing LLMs as language compilers for selected language pairs, hypothesizing that they can internalize syntactic structures to facilitate accurate translation. We introduce three techniques: KNN-Prompting with Retrieved Prompting Context, Chain-of-Thought Prompting, and Learning-from-Mistakes Prompting, with the last method addressing past errors. The evaluation results suggest that, even with limited corpora, LLMs, when paired with proper prompting, can effectively translate extremely low-resource languages.
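The retrieval step behind KNN-style prompting can be sketched as follows. This is a generic illustration, not the paper's exact implementation: the similarity function (word-count cosine), the `build_knn_prompt` helper, and the toy datastore of language pairs are all assumptions for demonstration purposes.

```python
from collections import Counter

def similarity(a: str, b: str) -> float:
    """Cosine similarity over word-count vectors (a simple stand-in retriever)."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    norm = (sum(v * v for v in ca.values()) ** 0.5) * (sum(v * v for v in cb.values()) ** 0.5)
    return dot / norm if norm else 0.0

def build_knn_prompt(source: str, datastore: list[tuple[str, str]], k: int = 3) -> str:
    """Pick the k parallel examples nearest to the source sentence and
    format them as in-context demonstrations for an LLM."""
    ranked = sorted(datastore, key=lambda pair: similarity(source, pair[0]), reverse=True)
    lines = [f"Source: {src}\nTranslation: {tgt}" for src, tgt in ranked[:k]]
    lines.append(f"Source: {source}\nTranslation:")
    return "\n\n".join(lines)

# Hypothetical toy datastore of (source-language, English) pairs --
# not actual data from the paper.
datastore = [
    ("maliyalu su", "thank you"),
    ("uwaqu a kasiw", "there is a tree"),
    ("maliyalu", "thanks"),
]
print(build_knn_prompt("maliyalu su kasiw", datastore, k=2))
```

The resulting prompt string would then be sent to the LLM, which completes the final `Translation:` line; a real system would use a stronger retriever (e.g. embedding-based nearest neighbors) over the parallel datastore.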