Translation Transformers Rediscover Inherent Data Domains

Maksym Del, Elizaveta Korotkova, Mark Fishel


Abstract
Many works have proposed methods to improve the performance of Neural Machine Translation (NMT) models in domain and multi-domain adaptation scenarios. However, an understanding of how NMT baselines represent text domain information internally is still lacking. Here we analyze the sentence representations learned by NMT Transformers and show that these explicitly include information on text domains, even after seeing only the input sentences without domain labels. Furthermore, we show that this internal information is enough to cluster sentences by their underlying domains without supervision. We show that NMT models produce clusters better aligned to the actual domains than pre-trained language models (LMs). Notably, when computed at the document level, NMT cluster-to-domain correspondence nears 100%. We combine these findings with an approach to NMT domain adaptation that uses automatically extracted domains. Whereas previous work relied on external LMs for text clustering, we propose re-using the NMT model itself as a source of unsupervised clusters. We perform an extensive experimental study comparing the two approaches across two data scenarios, three language pairs, and both sentence-level and document-level clustering, showing performance equal or significantly superior to that of LMs.
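The abstract's core recipe, extracting internal sentence representations from an NMT encoder and clustering them without supervision, can be illustrated with a minimal sketch. This is not the authors' exact pipeline (see the tartunlp/inherent-domains-wmt21 repository for that); the checkpoint name (Helsinki-NLP/opus-mt-en-de as a stand-in NMT model), the mean-pooling choice, and the example sentences below are all illustrative assumptions.

```python
# Sketch: cluster sentences by an NMT Transformer encoder's representations.
# NOT the paper's exact pipeline; checkpoint and pooling are assumptions.
import torch
from sklearn.cluster import KMeans
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-de"  # stand-in NMT model
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)
model.eval()

sentences = [
    "The patient was administered 5 mg of the drug.",    # medical-like
    "The parties agree to the terms of this contract.",  # legal-like
    "I can't believe you said that!",                    # subtitles-like
]

with torch.no_grad():
    batch = tokenizer(sentences, return_tensors="pt", padding=True)
    # Encoder hidden states: (batch, seq_len, hidden)
    enc = model.get_encoder()(**batch).last_hidden_state
    # Mean-pool over non-padding tokens to get one vector per sentence
    mask = batch["attention_mask"].unsqueeze(-1)
    sent_vecs = (enc * mask).sum(1) / mask.sum(1)

# Unsupervised clustering; cluster ids act as pseudo-domain labels
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
pseudo_domains = kmeans.fit_predict(sent_vecs.numpy())
print(pseudo_domains)
```

For the document-level variant the abstract refers to, one natural reading is to average the sentence vectors within each document before clustering, so every sentence inherits its document's cluster id; the paper reports that this document-level correspondence to the true domains nears 100%.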
Anthology ID:
2021.wmt-1.65
Volume:
Proceedings of the Sixth Conference on Machine Translation
Month:
November
Year:
2021
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Paco Guzmán, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Tom Kocmi, André Martins, Makoto Morishita, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
599–613
URL:
https://aclanthology.org/2021.wmt-1.65
Cite (ACL):
Maksym Del, Elizaveta Korotkova, and Mark Fishel. 2021. Translation Transformers Rediscover Inherent Data Domains. In Proceedings of the Sixth Conference on Machine Translation, pages 599–613, Online. Association for Computational Linguistics.
Cite (Informal):
Translation Transformers Rediscover Inherent Data Domains (Del et al., WMT 2021)
PDF:
https://aclanthology.org/2021.wmt-1.65.pdf
Video:
https://aclanthology.org/2021.wmt-1.65.mp4
Code
tartunlp/inherent-domains-wmt21
Data
OpenSubtitles