Virtual Compiler Is All You Need For Assembly Code Search

Zeyu Gao, Hao Wang, Yuanda Wang, Chao Zhang


Abstract
Assembly code search is vital for reducing the burden on reverse engineers, allowing them to quickly identify specific functions using natural language within vast binary programs. Despite its significance, this critical task is impeded by the complexities involved in building high-quality datasets. This paper explores training a Large Language Model (LLM) to emulate a general compiler. By leveraging Ubuntu packages to compile a dataset of 20 billion tokens, we continue pre-training CodeLlama as a Virtual Compiler (ViC), capable of compiling any source code to assembly code. This approach allows for “virtual” compilation across a wide range of programming languages without the need for a real compiler, preserving semantic equivalency and expanding the possibilities for assembly code dataset construction. Furthermore, we use ViC to construct a sufficiently large dataset for assembly code search. Employing this extensive dataset, we achieve a substantial improvement in assembly code search performance, with our model surpassing the leading baseline by 26%.
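The abstract describes prompting a continued-pre-trained code LLM so that it behaves as a compiler, mapping source code to assembly. A minimal sketch of how such a virtual compilation query might look with the Hugging Face transformers API is shown below; the checkpoint path and the prompt template are illustrative assumptions, not the paper's released artifacts.

```python
# Hedged sketch: querying a causal code LLM as a "virtual compiler".
# The checkpoint path and prompt layout below are illustrative assumptions;
# the paper's actual released model and prompt format may differ.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "path/to/ViC-checkpoint"  # hypothetical checkpoint path

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, device_map="auto")

source = """
int add(int a, int b) {
    return a + b;
}
"""

# Assumed prompt layout: source code followed by a cue asking for assembly.
prompt = f"### Source code:\n{source}\n### x86-64 assembly (O2):\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Keep only the generated continuation, i.e. the "virtually compiled" assembly.
assembly = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(assembly)
```

Because the generation step never invokes a real toolchain, the same pipeline can be pointed at source snippets in languages or build configurations that would otherwise be hard to compile, which is how the abstract motivates using ViC to scale up the assembly code search dataset.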
Anthology ID:
2024.acl-long.167
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3040–3051
URL:
https://aclanthology.org/2024.acl-long.167
Cite (ACL):
Zeyu Gao, Hao Wang, Yuanda Wang, and Chao Zhang. 2024. Virtual Compiler Is All You Need For Assembly Code Search. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3040–3051, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Virtual Compiler Is All You Need For Assembly Code Search (Gao et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.167.pdf