Taiqiang Wu, Chaofan Tao, Jiahao Wang, Runming Yang, Zhe Zhao, and Ngai Wong. 2025. Rethinking Kullback-Leibler Divergence in Knowledge Distillation for Large Language Models. In Proceedings of the 31st International Conference on Computational Linguistics, pages 5737-5755, Abu Dhabi, UAE. Association for Computational Linguistics. January 2025. Anthology ID: wu-etal-2025-rethinking. https://aclanthology.org/2025.coling-main.383/

Editors: Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, and Steven Schockaert.