Bradley Mcdanel
Also published as: Bradley McDanel
2024
Revelata at the FinLLM Challenge Task: Improving Financial Text Summarization by Restricted Prompt Engineering and Fine-tuning
Ken Kawamura
|
Zeqian Li
|
Chit-Kwan Lin
|
Bradley McDanel
Proceedings of the Eighth Financial Technology and Natural Language Processing Workshop and the 1st Agent AI for Scenario Planning
2023
ChatGPT as a Java Decompiler
Bradley Mcdanel
|
Zhanhao Liu
Proceedings of the Third Workshop on Natural Language Generation, Evaluation, and Metrics (GEM)
We propose a novel approach using instruction-tuned large language models (LLMs), such as ChatGPT, to automatically decompile entire Java classes. Our method relies only on a textual representation of the Java bytecode and corresponding unit tests generated from the bytecode. We apply no additional domain knowledge or fine-tuning; instead, we provide a single training example of the decompilation process in the model’s prompt. To overcome both compilation errors and test failures, we use an iterative prompting approach. We find that ChatGPT-4 is able to generate more human-readable output than existing software-based decompilers while achieving slightly lower pass rates on unit tests. Source code and datasets are available at https://github.com/BradMcDanel/gpt-java-decompiler.
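The sketch below illustrates the kind of iterative prompting loop the abstract describes: prompt the model with the bytecode listing and a single worked example, then feed compiler errors or failing-test output back into the prompt until the candidate source compiles and passes. It is a minimal illustration, not the authors' implementation; the `chat` helper, the choice of the OpenAI client and model name, and the assumption that the unit tests are a plain Java class whose `main` method exits non-zero on failure are all assumptions made here for concreteness.

```python
import os
import subprocess
import tempfile

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set; not necessarily the paper's setup


def chat(prompt: str, model: str = "gpt-4") -> str:
    """Single-turn chat completion (model name is an assumption)."""
    response = client.chat.completions.create(
        model=model, messages=[{"role": "user", "content": prompt}]
    )
    return response.choices[0].message.content


def compile_java(source: str, class_name: str, workdir: str) -> str | None:
    """Write the candidate source and try to compile it.
    Returns the compiler error text on failure, or None on success."""
    path = os.path.join(workdir, f"{class_name}.java")
    with open(path, "w") as f:
        f.write(source)
    result = subprocess.run(["javac", path], capture_output=True, text=True)
    return None if result.returncode == 0 else result.stderr


def run_tests(test_source: str, test_class: str, workdir: str) -> str | None:
    """Compile and run a test class (assumed to have a main method that exits
    non-zero on failure). Returns its error output, or None if it passes."""
    path = os.path.join(workdir, f"{test_class}.java")
    with open(path, "w") as f:
        f.write(test_source)
    compiled = subprocess.run(
        ["javac", "-cp", workdir, path], capture_output=True, text=True
    )
    if compiled.returncode != 0:
        return compiled.stderr
    run = subprocess.run(
        ["java", "-cp", workdir, test_class], capture_output=True, text=True
    )
    return None if run.returncode == 0 else (run.stdout + run.stderr)


def decompile(
    bytecode_text: str,
    test_source: str,
    class_name: str,
    test_class: str,
    one_shot_example: str,
    max_rounds: int = 5,
) -> str:
    """Iteratively prompt the model, feeding back compilation errors or
    failing-test output, until the candidate compiles and passes the tests
    or the round budget is exhausted."""
    prompt = (
        one_shot_example
        + "\nDecompile this Java bytecode into Java source code:\n"
        + bytecode_text
    )
    candidate = ""
    with tempfile.TemporaryDirectory() as workdir:
        for _ in range(max_rounds):
            candidate = chat(prompt)
            error = compile_java(candidate, class_name, workdir)
            if error is None:
                error = run_tests(test_source, test_class, workdir)
                if error is None:
                    return candidate  # compiles and passes the unit tests
            # Append the failure output so the next attempt can repair it.
            prompt += (
                "\nYour previous attempt failed with:\n"
                + error
                + "\nPlease fix the Java source and try again."
            )
    return candidate  # best effort after max_rounds
```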