Kamalika Chaudhuri


2022

Sentence-level Privacy for Document Embeddings
Casey Meehan | Khalil Mrini | Kamalika Chaudhuri
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

User language data can contain highly sensitive personal content. As such, it is imperative to offer users a strong and interpretable privacy guarantee when learning from their data. In this work, we propose SentDP, pure local differential privacy at the sentence level for a single user document. We introduce a novel technique, DeepCandidate, that combines concepts from robust statistics and language modeling to produce high-dimensional (768-dimensional), general 𝜖-SentDP document embeddings. This guarantees that any single sentence in a document can be substituted with any other sentence while keeping the embedding 𝜖-indistinguishable. Our experiments indicate that these private document embeddings are useful for downstream tasks such as sentiment analysis and topic classification, and even outperform baseline methods with weaker guarantees, such as word-level Metric DP.
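The sentence-level guarantee described in the abstract can be read as the standard pure differential privacy condition applied to sentence substitution. As a minimal sketch (the notation below is an assumption for illustration, not taken from the listing): a randomized embedding mechanism M satisfies 𝜖-SentDP if, for any two documents x and x′ that differ in exactly one sentence, and any set S of possible output embeddings,

\[
  \Pr[M(x) \in S] \;\le\; e^{\epsilon}\,\Pr[M(x') \in S].
\]

In other words, observing the released 768-dimensional embedding changes the likelihood ratio between any two candidate versions of a sentence by at most a factor of e^{𝜖}, which is what makes the substituted sentence 𝜖-indistinguishable.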