Differentially Private Knowledge Distillation via Synthetic Text Generation

James Flemings, Murali Annavaram


Abstract
Large Language Models (LLMs) achieve state-of-the-art performance on many downstream tasks. However, the increasing urgency of data privacy puts pressure on practitioners to train LLMs with Differential Privacy (DP) on private data. Concurrently, the exponential growth in the parameter size of LLMs necessitates model compression before deployment on resource-constrained devices or in latency-sensitive applications. Differential privacy and model compression each generally trade off some utility to achieve their objectives, and applying both simultaneously can compound the utility degradation. To this end, we propose DistilDP: a novel differentially private knowledge distillation algorithm that exploits synthetic data generated by a differentially private teacher LLM. The knowledge of the teacher LLM is transferred to the student in two ways: from the synthetic data itself (the hard labels), and from the output distribution of the teacher evaluated on the synthetic data (the soft labels). Furthermore, if the teacher and student share a similar architectural structure, we can further distill knowledge by aligning the hidden representations of both. Our experimental results demonstrate that DistilDP substantially improves utility over existing baselines, by at least 9.0 perplexity on the Big Patent dataset, under a strong privacy parameter, 𝜖=2. These promising results advance privacy-preserving compression of autoregressive LLMs. Our code can be accessed here: https://github.com/james-flemings/dp_compress.
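The abstract describes the student objective as combining three knowledge sources: hard labels from the DP synthetic text, soft labels from the teacher's output distribution on that text, and (when the architectures are similar) hidden-representation alignment. The following is a minimal, illustrative PyTorch-style sketch of such a combined loss; the function name, weighting coefficients, and the assumption that teacher and student hidden states share the same dimension are ours, not necessarily the paper's exact formulation.

import torch
import torch.nn.functional as F

def distillation_student_loss(student_logits, teacher_logits,
                              student_hidden, teacher_hidden,
                              target_ids, temperature=2.0,
                              alpha=0.5, beta=0.1):
    # Hard labels: next-token cross-entropy on the DP synthetic tokens.
    hard_loss = F.cross_entropy(
        student_logits.view(-1, student_logits.size(-1)),
        target_ids.view(-1),
    )

    # Soft labels: KL divergence between temperature-scaled teacher and
    # student distributions, evaluated on the same synthetic text.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    # Hidden-representation alignment: assumes matching hidden sizes
    # (otherwise a learned projection would be needed).
    hidden_loss = F.mse_loss(student_hidden, teacher_hidden)

    return (1 - alpha) * hard_loss + alpha * soft_loss + beta * hidden_loss

Because the synthetic text, soft labels, and hidden states are all derived from the DP-trained teacher, training the student on them can be viewed as post-processing and should not consume additional privacy budget, though the paper's exact accounting may differ.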
Anthology ID: 2024.findings-acl.769
Volume: Findings of the Association for Computational Linguistics ACL 2024
Month: August
Year: 2024
Address: Bangkok, Thailand and virtual meeting
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 12957–12968
URL: https://aclanthology.org/2024.findings-acl.769
Cite (ACL): James Flemings and Murali Annavaram. 2024. Differentially Private Knowledge Distillation via Synthetic Text Generation. In Findings of the Association for Computational Linguistics ACL 2024, pages 12957–12968, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal): Differentially Private Knowledge Distillation via Synthetic Text Generation (Flemings & Annavaram, Findings 2024)
PDF: https://aclanthology.org/2024.findings-acl.769.pdf