Practical Takes on Federated Learning with Pretrained Language Models

Ankur Agarwal, Mehdi Rezagholizadeh, Prasanna Parthasarathi


Abstract
Real-world applications of language models entail data privacy constraints when learning from diverse data domains. Federated learning with pretrained language models for language tasks has been gaining attention lately, but there are definite confounders that warrant careful study. Specifically, understanding the limits of federated NLP applications by varying different aspects (such as data heterogeneity, the trade-off between training time and performance, the effect of different data and client distributions, and the sensitivity of the shared model to learning local distributions) is necessary to evaluate whether language models indeed learn to generalize by adapting to the different domains. Towards that, we elaborate different hypotheses over the components in federated NLP architectures and study them in detail with relevant experiments over three tasks: Stanford Sentiment Treebank-2, OntoNotes 5.0, and GigaWord. The experiments with different Transformer inductive biases on the variety of tasks provide a glimpse into federated learning on NLP tasks. Specifically, the analysis suggests that regularization due to the ensembling effect may be masquerading as domain adaptation of federated learning in NLP with pretrained language models.
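To make the federated setup referenced in the abstract concrete, the following is a minimal FedAvg-style sketch (McMahan et al., 2017), not the authors' implementation: a tiny linear model stands in for a pretrained language model, and `local_update`, `fedavg`, and the toy client datasets are hypothetical placeholders assumed for illustration.

```python
# Minimal FedAvg sketch (illustrative only; not the paper's exact setup).
import copy
import torch
import torch.nn as nn

def local_update(global_model, data, lr=1e-3, epochs=1):
    """Fine-tune a copy of the shared model on one client's local data."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in data:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model.state_dict()

def fedavg(global_model, client_datasets, rounds=3):
    """Each round: clients train locally, server averages their weights."""
    for _ in range(rounds):
        client_states = [local_update(global_model, d) for d in client_datasets]
        avg_state = {
            k: torch.stack([s[k] for s in client_states]).mean(dim=0)
            for k in client_states[0]
        }
        global_model.load_state_dict(avg_state)
    return global_model

# Toy usage: two "clients" with random binary-classification batches.
clients = [
    [(torch.randn(8, 16), torch.randint(0, 2, (8,))) for _ in range(4)]
    for _ in range(2)
]
model = fedavg(nn.Linear(16, 2), clients)
```

The per-round averaging over independently fine-tuned client copies is the ensembling-like mechanism the abstract suggests may be masquerading as domain adaptation.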
Anthology ID:
2023.findings-eacl.34
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
454–471
URL:
https://aclanthology.org/2023.findings-eacl.34
DOI:
10.18653/v1/2023.findings-eacl.34
Cite (ACL):
Ankur Agarwal, Mehdi Rezagholizadeh, and Prasanna Parthasarathi. 2023. Practical Takes on Federated Learning with Pretrained Language Models. In Findings of the Association for Computational Linguistics: EACL 2023, pages 454–471, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Practical Takes on Federated Learning with Pretrained Language Models (Agarwal et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.34.pdf
Video:
https://aclanthology.org/2023.findings-eacl.34.mp4