Rethinking Word-Level Auto-Completion in Computer-Aided Translation

Xingyu Chen, Lemao Liu, Guoping Huang, Zhirui Zhang, Mingming Yang, Shuming Shi, Rui Wang


Abstract
Word-level auto-completion (WLAC) plays a crucial role in Computer-Assisted Translation. While previous studies have primarily focused on designing complex model architectures, this paper takes a different perspective by rethinking the fundamental question: what kind of words are good auto-completions? We introduce a measurable criterion to address this question and discover that existing WLAC models often fail to meet this criterion. Building upon this observation, we propose an effective approach to enhance WLAC performance by promoting adherence to the criterion. Notably, the proposed approach is general and can be applied to various encoder-based architectures. Through extensive experiments, we demonstrate that our approach outperforms the top-performing system submitted to the WLAC shared tasks in WMT2022, while utilizing significantly smaller model sizes.
Anthology ID:
2023.emnlp-main.952
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
15405–15415
URL:
https://aclanthology.org/2023.emnlp-main.952
DOI:
10.18653/v1/2023.emnlp-main.952
Cite (ACL):
Xingyu Chen, Lemao Liu, Guoping Huang, Zhirui Zhang, Mingming Yang, Shuming Shi, and Rui Wang. 2023. Rethinking Word-Level Auto-Completion in Computer-Aided Translation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 15405–15415, Singapore. Association for Computational Linguistics.
Cite (Informal):
Rethinking Word-Level Auto-Completion in Computer-Aided Translation (Chen et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.952.pdf
Video:
https://aclanthology.org/2023.emnlp-main.952.mp4