Reducing Disambiguation Biases in NMT by Leveraging Explicit Word Sense Information

Niccolò Campolungo, Tommaso Pasini, Denis Emelin, Roberto Navigli


Abstract
Recent studies have shed some light on a common pitfall of Neural Machine Translation (NMT) models, stemming from their struggle to disambiguate polysemous words without lapsing into their most frequently occurring senses in the training corpus. In this paper, we first present a novel approach for automatically creating high-precision sense-annotated parallel corpora, and then put forward a specifically tailored fine-tuning strategy for exploiting these sense annotations during training, without introducing any additional requirement at inference time. The use of explicit senses proved beneficial in reducing the disambiguation bias of a baseline NMT model, while also leading our system to attain higher BLEU scores than its vanilla counterpart in three language pairs.
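
This page does not spell out how the sense annotations are encoded for fine-tuning; purely as an illustrative sketch of the general idea, one could tag disambiguated source tokens inline with their sense keys and fine-tune a standard NMT model on the resulting parallel corpus, while feeding plain, untagged text at inference time. The function names and the "word|sense_key" format below are assumptions made for this example, not the paper's actual scheme.

# Illustrative sketch only: the annotation format and helper names here are
# hypothetical, chosen to show how sense-tagged training pairs might be built.

from typing import Dict, List, Tuple


def annotate_source(tokens: List[str], senses: Dict[int, str]) -> List[str]:
    """Attach an explicit sense label to selected source tokens.

    `senses` maps a token position to a sense identifier (e.g. a WordNet or
    BabelNet key) produced by a word sense disambiguation system.
    """
    return [
        f"{tok}|{senses[i]}" if i in senses else tok
        for i, tok in enumerate(tokens)
    ]


def build_finetuning_pair(src_tokens: List[str],
                          tgt_tokens: List[str],
                          senses: Dict[int, str]) -> Tuple[str, str]:
    """Create one (source, target) training pair with a sense-tagged source side.

    At inference time the same model can be fed plain, untagged sentences,
    so no extra disambiguation step is needed when translating.
    """
    return " ".join(annotate_source(src_tokens, senses)), " ".join(tgt_tokens)


if __name__ == "__main__":
    src = ["The", "bank", "approved", "the", "loan"]
    tgt = ["La", "banca", "ha", "approvato", "il", "prestito"]
    # Position 1 ("bank") disambiguated to its financial-institution sense.
    print(build_finetuning_pair(src, tgt, {1: "bank%1:14:00::"}))
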
Anthology ID: 2022.naacl-main.355
Volume: Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month: July
Year: 2022
Address: Seattle, United States
Editors: Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 4824–4838
URL: https://aclanthology.org/2022.naacl-main.355
DOI: 10.18653/v1/2022.naacl-main.355
Cite (ACL): Niccolò Campolungo, Tommaso Pasini, Denis Emelin, and Roberto Navigli. 2022. Reducing Disambiguation Biases in NMT by Leveraging Explicit Word Sense Information. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4824–4838, Seattle, United States. Association for Computational Linguistics.
Cite (Informal): Reducing Disambiguation Biases in NMT by Leveraging Explicit Word Sense Information (Campolungo et al., NAACL 2022)
PDF: https://aclanthology.org/2022.naacl-main.355.pdf
Software: 2022.naacl-main.355.software.zip
Video: https://aclanthology.org/2022.naacl-main.355.mp4