Self-Supervised Contrastive Learning with Adversarial Perturbations for Defending Word Substitution-based Attacks

Zhao Meng, Yihan Dong, Mrinmaya Sachan, Roger Wattenhofer


Abstract
In this paper, we present an approach to improve the robustness of BERT language models against word substitution-based adversarial attacks by leveraging adversarial perturbations for self-supervised contrastive learning. We create a word-level adversarial attack that generates hard positives on-the-fly as adversarial examples during contrastive learning. In contrast to previous works, our method improves model robustness without using any labeled data. Experimental results show that our method improves the robustness of BERT against four different word substitution-based adversarial attacks, and that combining our method with adversarial training yields higher robustness than adversarial training alone. As our method improves the robustness of BERT purely with unlabeled data, it opens up the possibility of using large text datasets to train robust language models against word substitution-based adversarial attacks.
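To make the core objective concrete, below is a minimal PyTorch sketch (not the authors' released code; see the LotusDYH/ssl_robust repository for that) of an NT-Xent-style contrastive loss in which each clean sentence embedding is pulled toward the embedding of its adversarially perturbed counterpart (the hard positive) and pushed away from the other examples in the batch. The embedding dimension, temperature, and the way the adversarial embeddings are produced here are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn.functional as F

def nt_xent_loss(z_clean, z_adv, temperature=0.1):
    # Stack clean and adversarial sentence embeddings: rows 0..B-1 are
    # clean, rows B..2B-1 are their adversarial counterparts.
    z = F.normalize(torch.cat([z_clean, z_adv], dim=0), dim=1)  # (2B, d)
    sim = z @ z.t() / temperature                               # pairwise similarities
    sim.fill_diagonal_(float("-inf"))                           # exclude self-similarity
    b = z_clean.size(0)
    # The positive of clean example i is adversarial example i (row i + b),
    # and vice versa; all other rows in the batch act as negatives.
    targets = torch.cat([torch.arange(b) + b, torch.arange(b)]).to(sim.device)
    return F.cross_entropy(sim, targets)

# Toy usage with random vectors standing in for BERT [CLS] embeddings;
# in the paper, z_adv would come from word substitution-based adversarial
# examples generated on-the-fly by the word-level attack.
z_clean = torch.randn(8, 768)
z_adv = z_clean + 0.05 * torch.randn(8, 768)  # hypothetical hard positives
print(nt_xent_loss(z_clean, z_adv).item())

Because the positives are adversarial examples rather than simple augmentations, this loss explicitly encourages the encoder to map a sentence and its word-substituted perturbation to nearby representations, which is the robustness property the paper targets.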
Anthology ID:
2022.findings-naacl.8
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
87–101
URL:
https://aclanthology.org/2022.findings-naacl.8
DOI:
10.18653/v1/2022.findings-naacl.8
Cite (ACL):
Zhao Meng, Yihan Dong, Mrinmaya Sachan, and Roger Wattenhofer. 2022. Self-Supervised Contrastive Learning with Adversarial Perturbations for Defending Word Substitution-based Attacks. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 87–101, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Self-Supervised Contrastive Learning with Adversarial Perturbations for Defending Word Substitution-based Attacks (Meng et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.8.pdf
Video:
https://aclanthology.org/2022.findings-naacl.8.mp4
Code:
LotusDYH/ssl_robust
Data:
IMDb Movie Reviews