Sohyung Kim


2021

Using Confidential Data for Domain Adaptation of Neural Machine Translation
Sohyung Kim | Arianna Bisazza | Fatih Turkmen
Proceedings of the Third Workshop on Privacy in Natural Language Processing

We study the problem of domain adaptation in Neural Machine Translation (NMT) when domain-specific data cannot be shared due to confidentiality or copyright issues. As a first step, we propose to fragment the data into phrase pairs and to fine-tune a generic NMT model on a random sample of these fragments rather than on the full sentences. Despite discarding long segments for the sake of confidentiality protection, we find that NMT quality can benefit considerably from this adaptation, and that further gains can be obtained with a simple tagging technique.
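
The recipe sketched in the abstract (fragment the confidential parallel data into phrase pairs, sample a subset, and mark the fragments with a tag before fine-tuning) can be illustrated with a minimal sketch. The contiguous-span fragmentation below is a naive stand-in for the alignment-based phrase extraction a real pipeline would use, and the `<frag>` tag, maximum phrase length, and sample size are illustrative assumptions rather than the paper's exact settings.

```python
import random

MAX_PHRASE_LEN = 4       # assumed cap on fragment length (not from the paper)
SAMPLE_SIZE = 100_000    # assumed number of phrase pairs to sample
FRAGMENT_TAG = "<frag>"  # illustrative tag marking fragment-level examples


def fragment(src: str, tgt: str, max_len: int = MAX_PHRASE_LEN):
    """Split a sentence pair into short phrase pairs.

    A real system would extract phrase pairs from word alignments; here we
    simply pair up contiguous sub-spans at matching positions, which only
    illustrates the idea of releasing short fragments instead of full
    confidential sentences.
    """
    src_toks, tgt_toks = src.split(), tgt.split()
    pairs = []
    for i in range(0, min(len(src_toks), len(tgt_toks)), max_len):
        s = " ".join(src_toks[i:i + max_len])
        t = " ".join(tgt_toks[i:i + max_len])
        if s and t:
            pairs.append((s, t))
    return pairs


def build_finetuning_data(parallel_corpus, sample_size=SAMPLE_SIZE, tag=FRAGMENT_TAG):
    """Fragment the corpus, randomly sample phrase pairs, and tag the source side."""
    all_pairs = []
    for src, tgt in parallel_corpus:
        all_pairs.extend(fragment(src, tgt))
    sampled = random.sample(all_pairs, min(sample_size, len(all_pairs)))
    # Prepending a tag lets the model distinguish fragment-level examples
    # from ordinary full-sentence training data during fine-tuning.
    return [(f"{tag} {s}", t) for s, t in sampled]


if __name__ == "__main__":
    corpus = [("the patient was given aspirin daily",
               "le patient a reçu de l aspirine quotidiennement")]
    for src, tgt in build_finetuning_data(corpus, sample_size=5):
        print(src, "|||", tgt)
```

The resulting tagged phrase pairs would then be mixed into the fine-tuning data for a generic NMT model; the specific tagging scheme and sampling rate used in the paper are not reproduced here.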