Dynamics of Instruction Fine-Tuning for Chinese Large Language Models

Chiyu Song, Zhanchao Zhou, Jianhao Yan, Yuejiao Fei, Zhenzhong Lan, Yue Zhang


Abstract
Instruction tuning is a burgeoning method to elicit the general intelligence of Large Language Models (LLMs). While numerous studies have examined the impact of factors such as data volume and model size on English models, the scaling properties of instruction tuning in other languages remain largely unexplored. In this work, we systematically investigate the effects of data quantity, model size, and data construction methods on instruction tuning for Chinese LLMs. We utilize a newly curated dataset, DoIT, which includes over 40,000 high-quality instruction instances covering ten underlying abilities, such as creative writing, code generation, and logical reasoning. Our experiments, conducted on models ranging from 7B to 33B parameters, yield three key findings: (i) While these factors directly affect overall model performance, some abilities are more responsive to scaling, whereas others demonstrate significant resistance. (ii) The scaling sensitivity of different abilities to these factors can be explained by two features: Complexity and Transference. (iii) By tailoring training strategies to their varying sensitivities, specific abilities can be efficiently learned, enhancing performance on two public benchmarks.
Anthology ID:
2025.coling-main.689
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
10345–10366
URL:
https://aclanthology.org/2025.coling-main.689/
Cite (ACL):
Chiyu Song, Zhanchao Zhou, Jianhao Yan, Yuejiao Fei, Zhenzhong Lan, and Yue Zhang. 2025. Dynamics of Instruction Fine-Tuning for Chinese Large Language Models. In Proceedings of the 31st International Conference on Computational Linguistics, pages 10345–10366, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Dynamics of Instruction Fine-Tuning for Chinese Large Language Models (Song et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.689.pdf