As Little as Possible, as Much as Necessary: Detecting Over- and Undertranslations with Contrastive Conditioning

Jannis Vamvas, Rico Sennrich


Abstract
Omission and addition of content are typical issues in neural machine translation. We propose a method for detecting such phenomena with off-the-shelf translation models. Using contrastive conditioning, we compare the likelihood of a full sequence under a translation model to the likelihood of its parts, given the corresponding source or target sequence. This allows us to pinpoint superfluous words in the translation and untranslated words in the source, even in the absence of a reference translation. The accuracy of our method is comparable to that of a supervised method that requires a custom quality estimation model.
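The comparison the abstract describes can be illustrated with a short sketch for the undertranslation (omission) case: delete a span from the source and check whether the translation becomes more likely under the scoring model. This is a minimal sketch, not the authors' released implementation (see the linked repository): it assumes a Hugging Face seq2seq model as the scorer, takes candidate spans as given (the paper derives them from a constituency parse), and the model name and helper names `log_likelihood` and `detect_omissions` are illustrative.

```python
# Minimal sketch of contrastive conditioning for omission detection.
# Assumptions: a Hugging Face seq2seq translation model is the scorer,
# and candidate source spans are supplied by the caller.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "Helsinki-NLP/opus-mt-de-en"  # example de->en scoring model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME).eval()


def log_likelihood(source: str, target: str) -> float:
    """Log-probability of `target` given `source` under the model."""
    inputs = tokenizer(source, return_tensors="pt")
    labels = tokenizer(text_target=target, return_tensors="pt").input_ids
    with torch.no_grad():
        # With `labels` set, the model returns the mean token-level NLL;
        # multiply by the target length to recover the sequence log-prob.
        loss = model(**inputs, labels=labels).loss
    return -loss.item() * labels.size(1)


def detect_omissions(source: str, translation: str, spans: list[str]) -> list[str]:
    """Flag source spans whose deletion makes the translation *more*
    likely, i.e. spans that the translation probably fails to cover."""
    full_score = log_likelihood(source, translation)
    omitted = []
    for span in spans:
        # Naive string deletion; the paper deletes parse constituents.
        partial_source = source.replace(span, "").strip()
        if log_likelihood(partial_source, translation) > full_score:
            omitted.append(span)
    return omitted
```

For example, `detect_omissions("Er hat gestern einen Apfel gegessen.", "He ate an apple.", ["gestern"])` should flag "gestern" ("yesterday"), since removing it brings the source closer to what the translation actually expresses. Overtranslation detection runs the same comparison in the reverse direction with a target-to-source model: a span in the translation is flagged as superfluous if deleting it makes the source more likely given the remaining translation.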
Anthology ID:
2022.acl-short.53
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
490–500
URL:
https://aclanthology.org/2022.acl-short.53
DOI:
10.18653/v1/2022.acl-short.53
Cite (ACL):
Jannis Vamvas and Rico Sennrich. 2022. As Little as Possible, as Much as Necessary: Detecting Over- and Undertranslations with Contrastive Conditioning. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 490–500, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
As Little as Possible, as Much as Necessary: Detecting Over- and Undertranslations with Contrastive Conditioning (Vamvas & Sennrich, ACL 2022)
PDF:
https://aclanthology.org/2022.acl-short.53.pdf
Software:
2022.acl-short.53.software.zip
Video:
https://aclanthology.org/2022.acl-short.53.mp4
Code:
zurichnlp/coverage-contrastive-conditioning