ReinforceBug: A Framework to Generate Adversarial Textual Examples

Bushra Sabir, Muhammad Ali Babar, Raj Gaire


Abstract
Adversarial Examples (AEs) generated by perturbing examples are useful in improving the robustness of Deep Learning (DL) based models. Most prior works generate AEs that are either unconscionable due to lexical errors or semantically and functionally deviant from the original examples. In this paper, we present ReinforceBug, a reinforcement learning framework that learns a policy that is transferable to unseen datasets and generates utility-preserving AEs that are transferable to other models. Our experiments show that ReinforceBug is on average 10% more successful than the state-of-the-art attack TextFooler. Moreover, the target models have on average 73.64% confidence in the wrong prediction, and the generated AEs preserve functional equivalence and semantic similarity (83.38%) to their original counterparts while transferring to other models with an average success rate of 46%.
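To make the attack setting concrete, the sketch below shows a word-substitution adversarial attack against a toy classifier. This is not the paper's ReinforceBug algorithm (which learns a reinforcement-learning policy); it is a simplified greedy baseline, and the toy classifier, cue-word sets, and synonym table are all invented for illustration.

```python
# Minimal sketch of word-substitution adversarial example generation.
# NOTE: this is a simplified greedy baseline for illustration, not the
# paper's RL-based ReinforceBug method. The classifier and synonym
# table are hypothetical.

def toy_sentiment(text):
    """Toy classifier: counts invented positive/negative cue words."""
    pos = {"good", "great", "excellent"}
    neg = {"bad", "awful", "terrible"}
    words = text.lower().split()
    score = sum(w in pos for w in words) - sum(w in neg for w in words)
    return "positive" if score >= 0 else "negative"

# Hypothetical synonym table standing in for a learned perturbation policy.
SYNONYMS = {
    "bad": ["suboptimal"],
    "awful": ["unpleasant"],
    "terrible": ["dreadful"],
}

def greedy_attack(text, classifier):
    """Greedily substitute words until the classifier's label flips."""
    original = classifier(text)
    words = text.split()
    for i, w in enumerate(words):
        for cand in SYNONYMS.get(w.lower(), []):
            trial = words[:i] + [cand] + words[i + 1:]
            if classifier(" ".join(trial)) != original:
                return " ".join(trial)  # label flipped: AE found
        # No single swap flipped the label; commit the first synonym
        # (if any) and keep perturbing later positions.
        if w.lower() in SYNONYMS:
            words[i] = SYNONYMS[w.lower()][0]
    final = " ".join(words)
    return final if classifier(final) != original else None

adv = greedy_attack("the movie was bad and awful", toy_sentiment)
```

A policy-based attacker like ReinforceBug replaces the fixed synonym table and greedy ordering with learned actions, which is what makes the perturbation strategy transferable across datasets.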
Anthology ID:
2021.naacl-main.477
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5954–5964
URL:
https://aclanthology.org/2021.naacl-main.477
DOI:
10.18653/v1/2021.naacl-main.477
Cite (ACL):
Bushra Sabir, Muhammad Ali Babar, and Raj Gaire. 2021. ReinforceBug: A Framework to Generate Adversarial Textual Examples. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 5954–5964, Online. Association for Computational Linguistics.
Cite (Informal):
ReinforceBug: A Framework to Generate Adversarial Textual Examples (Sabir et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.477.pdf
Optional supplementary code:
 2021.naacl-main.477.OptionalSupplementaryCode.zip
Optional supplementary data:
 2021.naacl-main.477.OptionalSupplementaryData.zip
Video:
 https://aclanthology.org/2021.naacl-main.477.mp4