PDC & DM-SFT: A Road for LLM SQL Bug-Fix Enhancing

Yiwen Duan, Yonghong Yu, Xiaoming Zhao, Yichang Wu, Wenbo Liu


Abstract
Code Large Language Models (Code LLMs), such as Code Llama and DeepSeek-Coder, have demonstrated exceptional performance on code generation tasks. However, most existing models focus on generating correct code and often struggle with bug repair. We introduce a suite of methods to enhance LLMs' SQL bug-fixing abilities. The methods consist of two main parts: Progressive Dataset Construction (PDC) from scratch and Dynamic Mask Supervised Fine-tuning (DM-SFT). PDC proposes two data expansion methods, from breadth-first and depth-first perspectives respectively. DM-SFT introduces an efficient supervised learning approach for bug fixing, which effectively reduces the total number of training steps and mitigates the "disorientation" that arises in SQL bug-fix training. In our evaluation, code LLMs trained with these two methods exceed all current best-performing models of much larger size.
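The abstract does not spell out how the dynamic mask works, but a natural reading of "mask supervised fine-tuning" for bug fixing is that the training loss is concentrated on the tokens of the fixed SQL that actually differ from the buggy input, rather than on the whole target sequence. The sketch below is an illustrative reconstruction under that assumption (the `dynamic_loss_mask` helper and the token-level diff via `difflib` are this sketch's own choices, not the paper's published method):

```python
import difflib

def dynamic_loss_mask(buggy_tokens, fixed_tokens):
    """Build a 0/1 loss mask over fixed_tokens.

    Positions where the fixed SQL differs from the buggy SQL get mask 1
    (loss is computed there); unchanged positions get mask 0 (loss is
    suppressed), so training focuses on the actual repair edits.
    Illustrative sketch only -- the paper's exact masking scheme,
    including what makes it "dynamic" across training, may differ.
    """
    mask = [0] * len(fixed_tokens)
    matcher = difflib.SequenceMatcher(a=buggy_tokens, b=fixed_tokens)
    for tag, _i1, _i2, j1, j2 in matcher.get_opcodes():
        if tag != "equal":  # replaced or inserted tokens in the fix
            for j in range(j1, j2):
                mask[j] = 1
    return mask

# A buggy query compares a numeric id against a string literal;
# the fix changes only that one token.
buggy = ["SELECT", "name", "FROM", "users", "WHERE", "id", "=", "'1'"]
fixed = ["SELECT", "name", "FROM", "users", "WHERE", "id", "=", "1"]
print(dynamic_loss_mask(buggy, fixed))  # → [0, 0, 0, 0, 0, 0, 0, 1]
```

In a typical fine-tuning setup, such a mask would be applied by setting the labels of masked-out positions to the loss-ignore index, so that gradient signal comes only from the edited span.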
Anthology ID:
2025.coling-industry.7
Volume:
Proceedings of the 31st International Conference on Computational Linguistics: Industry Track
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert, Kareem Darwish, Apoorv Agarwal
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
76–90
URL:
https://aclanthology.org/2025.coling-industry.7/
Cite (ACL):
Yiwen Duan, Yonghong Yu, Xiaoming Zhao, Yichang Wu, and Wenbo Liu. 2025. PDC & DM-SFT: A Road for LLM SQL Bug-Fix Enhancing. In Proceedings of the 31st International Conference on Computational Linguistics: Industry Track, pages 76–90, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
PDC & DM-SFT: A Road for LLM SQL Bug-Fix Enhancing (Duan et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-industry.7.pdf