Communication Efficient Federated Learning for Multilingual Neural Machine Translation with Adapter

Yi Liu, Xiaohan Bi, Lei Li, Sishuo Chen, Wenkai Yang, Xu Sun


Abstract
Federated Multilingual Neural Machine Translation (Fed-MNMT) has emerged as a promising paradigm for institutions with limited language resources. This approach allows multiple institutions to act as clients and train a unified model through model synchronization, rather than collecting sensitive data for centralized training. This significantly reduces the cost of corpus collection and preserves data privacy. However, as pre-trained language models (PLMs) continue to increase in size, the communication cost of transmitting parameters during synchronization has become a bottleneck for training speed. In this paper, we propose a communication-efficient Fed-MNMT framework that addresses this issue by keeping PLMs frozen and transferring only lightweight adapter modules between clients. Since different language pairs exhibit substantial discrepancies in data distributions, the adapter parameters of different clients may conflict with each other. To tackle this, we explore various clustering strategies that group clients for parameter aggregation and mitigate the negative effects of conflicting parameters. Experimental results demonstrate that our framework reduces communication cost by over 98% while achieving performance similar to or better than competitive baselines. Further analysis reveals that the clustering strategies effectively address linguistic discrepancy, and that pruning adapter modules further improves communication efficiency.
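To make the mechanism in the abstract concrete, the sketch below illustrates one synchronization round in which only adapter parameters are transmitted and aggregation happens within client clusters. This is a minimal, hypothetical illustration: the function names, the "adapter" parameter-naming convention, and the weighted-averaging scheme are assumptions for exposition, not the paper's exact implementation.

```python
import torch

def adapter_state(model):
    """Extract only the lightweight adapter parameters for transmission.
    Assumes adapter parameters are identifiable by name (illustrative)."""
    return {name: p.detach().clone()
            for name, p in model.named_parameters()
            if "adapter" in name}

def aggregate_within_cluster(client_states, weights):
    """Weighted average of adapter parameters from clients in one cluster,
    e.g. weighted by each client's local dataset size."""
    total = sum(weights)
    return {name: sum(w * s[name] for s, w in zip(client_states, weights)) / total
            for name in client_states[0]}

def federated_round(clusters):
    """One synchronization round: adapters are averaged only within a
    cluster of linguistically similar clients, so clients with conflicting
    data distributions never mix their parameters."""
    return {cluster_id: aggregate_within_cluster(states, weights)
            for cluster_id, (states, weights) in clusters.items()}
```

Because the frozen PLM is never transmitted, each round's communication cost is proportional to the adapter size alone, which is how the framework achieves the large reduction reported in the abstract.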
Anthology ID: 2023.findings-acl.327
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 5315–5328
URL: https://aclanthology.org/2023.findings-acl.327
DOI: 10.18653/v1/2023.findings-acl.327
Cite (ACL):
Yi Liu, Xiaohan Bi, Lei Li, Sishuo Chen, Wenkai Yang, and Xu Sun. 2023. Communication Efficient Federated Learning for Multilingual Neural Machine Translation with Adapter. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5315–5328, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Communication Efficient Federated Learning for Multilingual Neural Machine Translation with Adapter (Liu et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.327.pdf