Improving Translation Quality Estimation with Bias Mitigation

Hui Huang, Shuangzhi Wu, Kehai Chen, Hui Di, Muyun Yang, Tiejun Zhao


Abstract
State-of-the-art translation Quality Estimation (QE) models have been shown to be biased. More specifically, they over-rely on monolingual features while ignoring bilingual semantic alignment. In this work, we propose a novel method to mitigate the bias of the QE model and improve estimation performance. Our method is based on contrastive learning between clean and noisy sentence pairs. We first introduce noise to the target side of the parallel sentence pair, forming the negative samples. With the original parallel pairs as positive samples, the QE model is contrastively trained to distinguish the positive samples from the negative ones. This objective is trained jointly with the regression-style quality estimation objective, so as to prevent the QE model from overfitting to monolingual features. Experiments on WMT QE evaluation datasets demonstrate that our method improves estimation performance by a large margin while mitigating the bias.
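To make the joint objective concrete, below is a minimal PyTorch sketch of the general idea described in the abstract: a regression loss on the clean (positive) source-target pair combined with a margin-based contrastive loss that pushes the clean pair to score above a noised (negative) target. This is an illustrative sketch only, not the authors' implementation; the toy encoder, the embedding-level noise function, and the margin and loss-weight values are all assumptions.

```python
# Hedged sketch of contrastive bias mitigation for QE (not the paper's code).
import torch
import torch.nn as nn

class QEModel(nn.Module):
    """Toy QE model: a sentence-pair encoder followed by a regression head."""
    def __init__(self, hidden=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU())
        self.score_head = nn.Linear(hidden, 1)  # predicts a quality score

    def forward(self, src_emb, tgt_emb):
        h = self.encoder(torch.cat([src_emb, tgt_emb], dim=-1))
        return self.score_head(h).squeeze(-1)

def make_noisy_target(tgt_emb, noise_std=0.5):
    # Stand-in for target-side corruption (e.g. token drop / replacement in the
    # paper's setting); here we simply perturb the target representation.
    return tgt_emb + noise_std * torch.randn_like(tgt_emb)

def joint_loss(model, src_emb, tgt_emb, gold_score, margin=0.3, alpha=1.0):
    # Regression objective on the clean (positive) pair.
    pos_score = model(src_emb, tgt_emb)
    reg_loss = nn.functional.mse_loss(pos_score, gold_score)

    # Contrastive objective: the clean pair should score higher than the
    # noisy (negative) pair by at least `margin`.
    neg_score = model(src_emb, make_noisy_target(tgt_emb))
    contrast_loss = torch.clamp(margin - (pos_score - neg_score), min=0).mean()

    return reg_loss + alpha * contrast_loss

if __name__ == "__main__":
    torch.manual_seed(0)
    model = QEModel()
    src = torch.randn(8, 256)   # dummy source sentence embeddings
    tgt = torch.randn(8, 256)   # dummy target sentence embeddings
    gold = torch.rand(8)        # dummy gold quality scores in [0, 1]
    loss = joint_loss(model, src, tgt, gold)
    loss.backward()
    print(f"joint loss: {loss.item():.4f}")
```

The key design point, as the abstract describes, is that the contrastive term forces the model to attend to source-target alignment (since positive and negative pairs differ only on the target side), rather than scoring on monolingual fluency alone.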
Anthology ID:
2023.acl-long.121
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2175–2190
URL:
https://aclanthology.org/2023.acl-long.121
DOI:
10.18653/v1/2023.acl-long.121
Cite (ACL):
Hui Huang, Shuangzhi Wu, Kehai Chen, Hui Di, Muyun Yang, and Tiejun Zhao. 2023. Improving Translation Quality Estimation with Bias Mitigation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2175–2190, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Improving Translation Quality Estimation with Bias Mitigation (Huang et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.121.pdf
Video:
https://aclanthology.org/2023.acl-long.121.mp4