KnowComp Submission for WMT23 Word-Level AutoCompletion Task

Yi Wu, Haochen Shi, Weiqi Wang, Yangqiu Song


Abstract
The NLP community has recently witnessed the success of Large Language Models (LLMs) across a wide range of Natural Language Processing (NLP) tasks. However, the potential of LLMs for word-level auto-completion in a multilingual context has not yet been thoroughly explored. To address this gap and benchmark the performance of LLMs, we propose an LLM-based system for the WMT23 Word-Level Auto-Completion (WLAC) task. Our system uses ChatGPT as a representative LLM and evaluates its performance in three translation directions: Chinese-English, German-English, and English-German. We also study the task under zero-shot and few-shot settings to assess the benefit of incorporating exemplars from the training set to guide the LLM. Our experiments show that, on average, the system attains 29.8% accuracy on the test set. Further analyses reveal that LLMs struggle with WLAC in the zero-shot setting, but performance improves significantly with additional exemplars, though some common errors still appear frequently. These findings have important implications for incorporating LLMs into computer-aided translation systems, as they can potentially enhance translation quality. Our evaluation code is available at https://github.com/ethanyiwu/WLAC.
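The abstract describes prompting ChatGPT for WLAC in zero-shot and few-shot settings. A minimal sketch of how such a prompt might be assembled is shown below; all names (`Exemplar`, `build_wlac_prompt`) and the exact field wording are illustrative assumptions, not taken from the authors' released code.

```python
# Hypothetical sketch of few-shot prompt construction for word-level
# auto-completion (WLAC). The prompt format is an assumption for
# illustration; it is not the authors' actual prompt.
from dataclasses import dataclass
from typing import List

@dataclass
class Exemplar:
    source: str        # source-language sentence
    left_context: str  # translated words to the left of the gap
    right_context: str # translated words to the right of the gap
    typed: str         # characters the translator has already typed
    target: str        # gold completion word (empty for the query)

INSTRUCTION = (
    "Complete the target-language word that the translator has started "
    "typing, given the source sentence and the partial translation context."
)

def format_example(ex: Exemplar, with_answer: bool) -> str:
    """Render one WLAC instance; omit the answer for the query."""
    return "\n".join([
        f"Source: {ex.source}",
        f"Left context: {ex.left_context}",
        f"Right context: {ex.right_context}",
        f"Typed characters: {ex.typed}",
        f"Completion: {ex.target if with_answer else ''}".rstrip(),
    ])

def build_wlac_prompt(exemplars: List[Exemplar], query: Exemplar) -> str:
    """Zero-shot when `exemplars` is empty; few-shot otherwise."""
    parts = [INSTRUCTION]
    for ex in exemplars:
        parts.append(format_example(ex, with_answer=True))
    parts.append(format_example(query, with_answer=False))
    return "\n\n".join(parts)
```

The returned string would then be sent to the chat model, and the model's single-word reply taken as the predicted completion.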
Anthology ID:
2023.wmt-1.79
Volume:
Proceedings of the Eighth Conference on Machine Translation
Month:
December
Year:
2023
Address:
Singapore
Editors:
Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
882–889
URL:
https://aclanthology.org/2023.wmt-1.79
DOI:
10.18653/v1/2023.wmt-1.79
Cite (ACL):
Yi Wu, Haochen Shi, Weiqi Wang, and Yangqiu Song. 2023. KnowComp Submission for WMT23 Word-Level AutoCompletion Task. In Proceedings of the Eighth Conference on Machine Translation, pages 882–889, Singapore. Association for Computational Linguistics.
Cite (Informal):
KnowComp Submission for WMT23 Word-Level AutoCompletion Task (Wu et al., WMT 2023)
PDF:
https://aclanthology.org/2023.wmt-1.79.pdf