Improving Neural Machine Translation by Multi-Knowledge Integration with Prompting

Ke Wang, Jun Xie, Yuqi Zhang, Yu Zhao


Abstract
Improving neural machine translation (NMT) systems with prompting has achieved significant progress in recent years. In this work, we focus on how to integrate multi-knowledge, i.e., multiple types of knowledge, into NMT models to enhance performance with prompting. We propose a unified framework that effectively integrates multiple types of knowledge, including sentences, terminologies/phrases, and translation templates, into NMT models. We use these types of knowledge as prefix-prompts on the input to the encoder and decoder of NMT models to guide the translation process. The approach requires no changes to the model architecture and adapts effectively to domain-specific translation without retraining. Experiments on English-Chinese and English-German translation demonstrate that our approach significantly outperforms strong baselines, achieving high translation quality and terminology match accuracy.
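To make the prefix-prompt idea concrete, below is a minimal sketch of how multiple knowledge types (example sentences, terminology pairs, and templates) might be serialized into a single prompted input for an NMT encoder. The tag tokens (`<sent>`, `<term>`, `<tpl>`, `<sep>`) and the function name are illustrative assumptions, not the paper's actual format.

```python
# Hypothetical sketch: prepend multi-knowledge as prefix-prompts to the source.
# Tag tokens and formatting are assumptions for illustration only.
def build_prompted_input(source, sentences=None, terms=None, templates=None,
                         sep="<sep>"):
    """Serialize knowledge snippets as a prefix-prompt before the source sentence."""
    prefix = []
    for sent in (sentences or []):            # similar translated sentences
        prefix.append(f"<sent> {sent}")
    for src_term, tgt_term in (terms or []):  # terminology / phrase pairs
        prefix.append(f"<term> {src_term} = {tgt_term}")
    for tpl in (templates or []):             # translation templates
        prefix.append(f"<tpl> {tpl}")
    if not prefix:
        return source                         # no knowledge: plain input
    return f" {sep} ".join(prefix) + f" {sep} " + source

# Usage: constrain a term translation via a terminology prompt.
prompted = build_prompted_input(
    "The patient shows signs of hypertension.",
    terms=[("hypertension", "Hypertonie")],
)
```

The resulting string would be fed to the encoder in place of the bare source sentence, so the model can copy or attend to the supplied knowledge without any architecture change.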
Anthology ID:
2023.findings-emnlp.333
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5000–5010
URL:
https://aclanthology.org/2023.findings-emnlp.333
DOI:
10.18653/v1/2023.findings-emnlp.333
Cite (ACL):
Ke Wang, Jun Xie, Yuqi Zhang, and Yu Zhao. 2023. Improving Neural Machine Translation by Multi-Knowledge Integration with Prompting. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 5000–5010, Singapore. Association for Computational Linguistics.
Cite (Informal):
Improving Neural Machine Translation by Multi-Knowledge Integration with Prompting (Wang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.333.pdf