ER-AE: Differentially Private Text Generation for Authorship Anonymization

Haohan Bo, Steven H. H. Ding, Benjamin C. M. Fung, Farkhund Iqbal


Abstract
Most privacy protection studies for textual data focus on removing explicit sensitive identifiers. However, personal writing style, a strong indicator of authorship, is often neglected. Recent studies, such as SynTF, have shown promising results on privacy-preserving text mining. However, their anonymization algorithms can only output numeric term vectors, which are difficult for recipients to interpret. We propose a novel text generation model with a two-set exponential mechanism for authorship anonymization. By augmenting the semantic information through a REINFORCE training reward function, the model can generate differentially private text that is close in semantics and similar in grammatical structure to the original text while removing personal traits of the writing style. It does not assume any conditioned labels or parallel text data for training. We evaluate the performance of the proposed model on a real-life peer review dataset and the Yelp review dataset. The results suggest that our model outperforms the state of the art in semantic preservation, authorship obfuscation, and stylometric transformation.
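For intuition about the privacy component, the sketch below shows a generic exponential mechanism for sampling a replacement token from utility scores. This is a minimal single-set illustration, not the paper's two-set variant; the candidate list, the similarity-based utility function, and the epsilon value are all hypothetical stand-ins.

```python
import numpy as np

def exponential_mechanism_sample(candidates, utilities, epsilon, sensitivity=1.0, rng=None):
    """Sample one candidate token under the exponential mechanism.

    Each candidate c is chosen with probability proportional to
    exp(epsilon * u(c) / (2 * sensitivity)), which satisfies
    epsilon-differential privacy with respect to the utility function u.
    """
    rng = rng or np.random.default_rng()
    scores = np.asarray(utilities, dtype=np.float64)
    logits = epsilon * scores / (2.0 * sensitivity)
    logits -= logits.max()          # subtract max for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()            # normalize into a distribution
    return candidates[rng.choice(len(candidates), p=probs)]

# Hypothetical usage: pick a substitute for an original token.
# The utility here stands in for, e.g., embedding similarity to the original word.
vocab = ["excellent", "great", "good", "fine", "okay"]
similarity = [0.95, 0.90, 0.70, 0.50, 0.30]
print(exponential_mechanism_sample(vocab, similarity, epsilon=2.0))
```

Higher epsilon concentrates probability on high-utility (semantically closer) tokens, while lower epsilon flattens the distribution and strengthens the privacy guarantee, which is the trade-off the paper's semantic reward is designed to mitigate.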
Anthology ID:
2021.naacl-main.314
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
3997–4007
URL:
https://aclanthology.org/2021.naacl-main.314
DOI:
10.18653/v1/2021.naacl-main.314
Cite (ACL):
Haohan Bo, Steven H. H. Ding, Benjamin C. M. Fung, and Farkhund Iqbal. 2021. ER-AE: Differentially Private Text Generation for Authorship Anonymization. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 3997–4007, Online. Association for Computational Linguistics.
Cite (Informal):
ER-AE: Differentially Private Text Generation for Authorship Anonymization (Bo et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.314.pdf
Video:
https://aclanthology.org/2021.naacl-main.314.mp4
Code:
McGill-DMaS/AuthorshipAnonymization + additional community code