TABS: Efficient Textual Adversarial Attack for Pre-trained NL Code Model Using Semantic Beam Search

YunSeok Choi, Hyojun Kim, Jee-Hyong Lee


Abstract
As pre-trained models have shown successful performance in programming language processing as well as natural language processing, adversarial attacks on these models have also attracted attention. However, previous black-box adversarial attacks generated adversarial examples inefficiently with simple greedy search, and they failed to find better adversarial examples because the search space was hard to reduce without performance loss. In this paper, we propose TABS, an efficient beam-search black-box adversarial attack method. We adopt beam search to find better adversarial examples, and contextual semantic filtering to effectively reduce the search space. Contextual semantic filtering reduces the number of candidate adversarial words by considering the surrounding context and semantic similarity. Our proposed method shows good performance in terms of attack success rate, number of queries, and semantic similarity when attacking models for two NL code search tasks: classification and retrieval.
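To make the described search procedure concrete, below is a minimal Python sketch of a beam-search attack loop with semantic filtering, in the spirit of the abstract. All names here (score_fn, candidate_substitutes, similarity) are hypothetical stand-ins, not the authors' actual API: score_fn would query the victim model in a black-box fashion, candidate_substitutes would propose context-aware word replacements, and similarity would measure semantic closeness to the original query.

```python
from typing import Callable, List, Tuple

def semantic_beam_search_attack(
    tokens: List[str],
    score_fn: Callable[[List[str]], float],  # hypothetical: victim-model attack score; higher = closer to fooling it
    candidate_substitutes: Callable[[List[str], int], List[str]],  # hypothetical: context-aware candidates for a position
    similarity: Callable[[List[str], List[str]], float],  # hypothetical: semantic similarity to the original
    beam_width: int = 5,
    sim_threshold: float = 0.8,
) -> List[str]:
    """Sketch: at each position, expand every beam with filtered
    substitutes and keep only the top-`beam_width` variants."""
    beams: List[Tuple[float, List[str]]] = [(score_fn(tokens), list(tokens))]
    for pos in range(len(tokens)):
        expanded = list(beams)  # keeping unmodified beams allows skipping a position
        for _, seq in beams:
            for word in candidate_substitutes(seq, pos):
                cand = seq[:pos] + [word] + seq[pos + 1:]
                # Contextual semantic filtering: discard candidates that
                # drift too far from the original sentence's meaning,
                # shrinking the search space before querying the model.
                if similarity(tokens, cand) < sim_threshold:
                    continue
                expanded.append((score_fn(cand), cand))
        # Beam pruning bounds the number of victim-model queries.
        beams = sorted(expanded, key=lambda x: x[0], reverse=True)[:beam_width]
    return beams[0][1]
```

Compared with greedy search (beam_width = 1), widening the beam explores several promising partial substitutions in parallel, while the similarity threshold keeps the number of model queries low; this matches the trade-off the abstract reports among attack success rate, query count, and semantic similarity.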
Anthology ID: 2022.emnlp-main.369
Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 5490–5498
URL: https://aclanthology.org/2022.emnlp-main.369
DOI: 10.18653/v1/2022.emnlp-main.369
Cite (ACL): YunSeok Choi, Hyojun Kim, and Jee-Hyong Lee. 2022. TABS: Efficient Textual Adversarial Attack for Pre-trained NL Code Model Using Semantic Beam Search. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 5490–5498, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): TABS: Efficient Textual Adversarial Attack for Pre-trained NL Code Model Using Semantic Beam Search (Choi et al., EMNLP 2022)
PDF: https://aclanthology.org/2022.emnlp-main.369.pdf