Extracting the Essence and Discarding the Dross: Enhancing Code Generation with Contrastive Execution Feedback

Xuanyu Zhang, Qing Yang


Abstract
Recent advancements have integrated the execution process and its feedback into the training of large language models for code generation, demonstrating enhanced model performance. However, current methods concatenate erroneous code, its execution feedback, and the final correct code into a single target sequence, inadvertently increasing the probability of generating both correct and incorrect code during inference. While multiple iterations of feedback can eventually yield the correct answer, this iterative process is cumbersome and time-consuming for users who expect an accurate result immediately. To address this challenge, we propose ConCoder, a contrastive learning-based code generation model with execution feedback. This approach enables the model to produce accurate code from the outset while still learning to rectify and optimize incorrect code. Furthermore, our training emphasizes learning from the causes of errors, allowing the model to understand and avoid mistakes. Through extensive experiments, ConCoder demonstrates significant improvements in generating accurate code and understanding error correction, paving the way for more reliable code generation models.
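To make the core idea concrete, the sketch below shows one plausible shape for such a training objective: a standard cross-entropy term on the executor-verified correct program, combined with a sequence-level margin term that pushes the model's likelihood of the correct code above that of the failing code. This is a minimal sketch under stated assumptions (a HuggingFace-style causal LM returning `.logits`, a hinge-style ranking loss, and the `margin` hyperparameter); it is not the paper's exact ConCoder formulation.

```python
# Illustrative sketch only (PyTorch); the actual ConCoder objective may differ.
import torch
import torch.nn.functional as F

def sequence_log_prob(model, input_ids, target_ids, pad_id):
    """Sum of token log-probabilities of `target_ids` conditioned on `input_ids`."""
    ids = torch.cat([input_ids, target_ids], dim=-1)
    # Logits at positions that predict each target token (causal LM shift).
    logits = model(ids).logits[:, input_ids.size(-1) - 1 : -1, :]
    logp = F.log_softmax(logits, dim=-1)
    token_logp = logp.gather(-1, target_ids.unsqueeze(-1)).squeeze(-1)
    mask = (target_ids != pad_id).float()  # ignore padding positions
    return (token_logp * mask).sum(-1)

def contrastive_exec_loss(model, prompt, good_code, bad_code, pad_id, margin=1.0):
    """Hypothetical loss: learn the correct program and keep its log-likelihood
    at least `margin` above that of the executor-rejected program."""
    logp_good = sequence_log_prob(model, prompt, good_code, pad_id)
    logp_bad = sequence_log_prob(model, prompt, bad_code, pad_id)
    nll = -logp_good.mean()                                # fit the correct code
    rank = F.relu(margin - (logp_good - logp_bad)).mean()  # suppress the bad code
    return nll + rank
```

In such a setup, the `good_code`/`bad_code` pairs would come from executing candidate programs against tests, keeping the passing version as the positive example and the failing version as the negative, consistent with the execution-feedback premise described in the abstract.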
Anthology ID: 2025.coling-main.704
Volume: Proceedings of the 31st International Conference on Computational Linguistics
Month: January
Year: 2025
Address: Abu Dhabi, UAE
Editors: Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue: COLING
Publisher: Association for Computational Linguistics
Pages: 10569–10575
URL: https://aclanthology.org/2025.coling-main.704/
Cite (ACL): Xuanyu Zhang and Qing Yang. 2025. Extracting the Essence and Discarding the Dross: Enhancing Code Generation with Contrastive Execution Feedback. In Proceedings of the 31st International Conference on Computational Linguistics, pages 10569–10575, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal): Extracting the Essence and Discarding the Dross: Enhancing Code Generation with Contrastive Execution Feedback (Zhang & Yang, COLING 2025)
PDF: https://aclanthology.org/2025.coling-main.704.pdf