DP-NMT: Scalable Differentially Private Machine Translation

Timour Igamberdiev, Doan Nam Long Vu, Felix Kuennecke, Zhuo Yu, Jannik Holmer, Ivan Habernal


Abstract
Neural machine translation (NMT) is a widely popular text generation task, yet there is a considerable research gap in the development of privacy-preserving NMT models, despite significant data privacy concerns for NMT systems. Differentially private stochastic gradient descent (DP-SGD) is a popular method for training machine learning models with concrete privacy guarantees; however, the implementation specifics of training a model with DP-SGD are not always clarified in existing work: different software libraries are used and code bases are not always public, leading to reproducibility issues. To tackle this, we introduce DP-NMT, an open-source framework for carrying out research on privacy-preserving NMT with DP-SGD, bringing together numerous models, datasets, and evaluation metrics in one systematic software package. Our goal is to provide a platform for researchers to advance the development of privacy-preserving NMT systems, keeping the specific details of the DP-SGD algorithm transparent and intuitive to implement. We run a set of experiments on datasets from both general and privacy-related domains to demonstrate our framework in use. We make our framework publicly available and welcome feedback from the community.
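
As background to the abstract, the following is a minimal sketch of the core DP-SGD update the framework revolves around: per-example gradients are clipped to an L2 bound, summed, and perturbed with Gaussian noise calibrated to that bound before the averaged gradient is applied. The sketch is written in JAX; the toy loss, function names, and hyperparameters are illustrative assumptions and do not reflect the actual DP-NMT API.

# Illustrative DP-SGD step (not the DP-NMT API).
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    # Toy per-example squared-error loss on a linear model.
    pred = jnp.dot(x, params)
    return (pred - y) ** 2

def dp_sgd_step(params, xs, ys, key, lr=0.1, clip_norm=1.0, noise_mult=1.0):
    # Per-example gradients via vmap over the batch dimension.
    per_ex_grads = jax.vmap(jax.grad(loss_fn), in_axes=(None, 0, 0))(params, xs, ys)
    # Clip each example's gradient to L2 norm <= clip_norm.
    norms = jnp.linalg.norm(per_ex_grads, axis=1, keepdims=True)
    scale = jnp.minimum(1.0, clip_norm / (norms + 1e-12))
    clipped = per_ex_grads * scale
    # Sum, add Gaussian noise scaled by noise_mult * clip_norm, then average.
    noise = noise_mult * clip_norm * jax.random.normal(key, params.shape)
    noisy_mean = (jnp.sum(clipped, axis=0) + noise) / xs.shape[0]
    return params - lr * noisy_mean

# Usage: one noisy update on random toy data.
key = jax.random.PRNGKey(0)
xs = jax.random.normal(key, (8, 3))
ys = jnp.zeros(8)
params = jnp.zeros(3)
params = dp_sgd_step(params, xs, ys, jax.random.PRNGKey(1))
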
Anthology ID:
2024.eacl-demo.11
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations
Month:
March
Year:
2024
Address:
St. Julian's, Malta
Editors:
Nikolaos Aletras, Orphée De Clercq
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
94–105
URL:
https://aclanthology.org/2024.eacl-demo.11
Cite (ACL):
Timour Igamberdiev, Doan Nam Long Vu, Felix Kuennecke, Zhuo Yu, Jannik Holmer, and Ivan Habernal. 2024. DP-NMT: Scalable Differentially Private Machine Translation. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations, pages 94–105, St. Julian's, Malta. Association for Computational Linguistics.
Cite (Informal):
DP-NMT: Scalable Differentially Private Machine Translation (Igamberdiev et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-demo.11.pdf