Contrastive Token Learning with Similarity Decay for Repetition Suppression in Machine Translation

Huangyu Dai, Ben Chen, Kaidi Chen, Ying Han, Zihan Liang, Wen Jiang


Abstract
For cross-lingual conversation and trade, Neural Machine Translation (NMT) is pivotal yet faces persistent challenges with monotony and repetition in generated content. Traditional solutions that rely on penalizing text redundancy or token reoccurrence have shown limited efficacy, particularly for lengthy articles and e-commerce descriptions with inherent redundancy, even with the advent of Large Language Models (LLMs). This paper investigates the underlying causes of textual repetition through the lens of information entropy, attributing the phenomenon to the elevated uncertainty within the input text. To address this, a novel algorithm named Contrastive Token Learning with Similarity Decay (CTSD) is introduced, which modulates the suppression of tokens dynamically, informed by varying attention weights and inter-token distances. Furthermore, an e-commerce dataset comprising title texts of real online items that are susceptible to hallucinated translations is compiled and released to benchmark the algorithm. Extensive evaluations demonstrate that CTSD significantly outperforms existing approaches in precision and generalizability. Additional online A/B testing underscores its practical value, showing marked improvements in user engagement and conversion. Notably, this method has been implemented with full traffic on eight multilingual sites of alibaba.com, the largest B2B e-commerce platform in the world.
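The abstract describes CTSD as weighting the suppression of previously emitted tokens by attention weights and inter-token distance. The snippet below is a minimal PyTorch sketch of that idea in the style of a contrastive token loss, where earlier tokens act as negatives whose penalty decays with distance; the function name, arguments, and the softplus/logsigmoid formulation are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def ctsd_loss(logits, gold_id, prev_tokens, attn_weights, decay=0.9):
    """Sketch of a contrastive token loss with similarity/distance decay.

    logits:       (vocab_size,) decoder logits at the current step
    gold_id:      id of the reference token at this step
    prev_tokens:  (t,) ids of previously generated tokens (repeat candidates)
    attn_weights: (t,) attention from the current step to each previous token
    decay:        hypothetical decay factor; recent tokens are suppressed more
    """
    t = prev_tokens.size(0)
    if t == 0:
        return logits.new_zeros(())
    # Distance-based decay: token emitted i steps ago gets weight decay**i.
    distances = torch.arange(t - 1, -1, -1, dtype=logits.dtype,
                             device=logits.device)
    weights = attn_weights * decay ** distances
    # Contrastive term: the gold token's logit should exceed the logit of
    # each previously emitted token, weighted by attention and decay.
    gap = logits[gold_id] - logits[prev_tokens]
    return -(weights * F.logsigmoid(gap)).sum()
```

In training, this term would be added to the standard cross-entropy loss at each decoding step, so that repeated tokens that the model currently attends to strongly are pushed down hardest.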
Anthology ID:
2024.findings-emnlp.185
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3247–3261
URL:
https://aclanthology.org/2024.findings-emnlp.185
Cite (ACL):
Huangyu Dai, Ben Chen, Kaidi Chen, Ying Han, Zihan Liang, and Wen Jiang. 2024. Contrastive Token Learning with Similarity Decay for Repetition Suppression in Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 3247–3261, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Contrastive Token Learning with Similarity Decay for Repetition Suppression in Machine Translation (Dai et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.185.pdf