Keep it Private: Unsupervised Privatization of Online Text

Calvin Bao, Marine Carpuat


Abstract
Authorship obfuscation techniques hold the promise of helping people protect their privacy in online communications by automatically rewriting text to hide the identity of the original author. However, obfuscation has been evaluated in narrow settings in the NLP literature and has primarily been addressed with superficial edit operations that can lead to unnatural outputs. In this work, we introduce an automatic text privatization framework that fine-tunes a large language model via reinforcement learning to produce rewrites that balance soundness, sense, and privacy. We evaluate it extensively on a large-scale test set of short to medium-length English Reddit posts by 68k authors. We study how performance varies across evaluation conditions, including authorial profile length and authorship detection strategy. Our method maintains high text quality according to both automated metrics and human evaluation, and successfully evades several automated authorship attacks.
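The abstract describes a reward that balances soundness (fluency), sense (meaning preservation), and privacy (evading an authorship attacker). As a hedged illustration only, and not the authors' actual implementation, such a composite reward for RL fine-tuning might be a weighted average of per-criterion scores, with the weights and normalization being hypothetical choices:

```python
def composite_reward(sense: float, soundness: float, privacy: float,
                     weights=(1.0, 1.0, 1.0)) -> float:
    """Toy scalar reward balancing meaning preservation (sense),
    fluency (soundness), and evasion of an authorship attacker (privacy,
    e.g. one minus attacker confidence). All inputs are assumed to be
    normalized to [0, 1]; the weights are illustrative, not the paper's."""
    w_sense, w_sound, w_priv = weights
    total = w_sense + w_sound + w_priv
    return (w_sense * sense + w_sound * soundness + w_priv * privacy) / total

# A rewrite that slightly degrades meaning and fluency but sharply
# reduces attacker confidence can outscore a near-verbatim copy.
assert composite_reward(0.9, 0.9, 0.8) > composite_reward(1.0, 1.0, 0.1)
```

In an actual RL setup, this scalar would be the reward signal driving policy-gradient updates to the rewriting model.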
Anthology ID:
2024.naacl-long.480
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
8670–8685
URL:
https://aclanthology.org/2024.naacl-long.480
Cite (ACL):
Calvin Bao and Marine Carpuat. 2024. Keep it Private: Unsupervised Privatization of Online Text. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 8670–8685, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Keep it Private: Unsupervised Privatization of Online Text (Bao & Carpuat, NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.480.pdf
Copyright:
 2024.naacl-long.480.copyright.pdf