Distilling Rule-based Knowledge into Large Language Models

Wenkai Yang, Yankai Lin, Jie Zhou, Ji-Rong Wen


Abstract
Large language models (LLMs) have shown impressive performance on a wide range of real-world tasks. The current paradigm of knowledge learning for LLMs is mainly based on learning from examples, in which LLMs learn internal rules implicitly from a certain number of supervised examples. However, this learning paradigm may not learn complicated rules well, especially when the training examples are limited. We are inspired by the fact that humans can also learn new tasks or knowledge in another way, namely by learning from rules: given only a detailed rule and a few optional examples, humans can grasp new knowledge quickly and generalize well. Therefore, in this paper, we explore the feasibility of this new learning paradigm, which aims to encode rule-based knowledge into LLMs. We further propose rule distillation, which first uses the strong in-context abilities of LLMs to extract knowledge from textual rules, and then explicitly encodes this knowledge into the parameters of LLMs by learning from the in-context signals produced inside the model. Our experiments show that making LLMs learn from rules with our method is much more efficient than example-based learning in terms of both sample size and generalization ability. Warning: This paper may contain examples with offensive content.
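The following is a minimal sketch of the rule-distillation idea as described in the abstract, not the authors' implementation: the same model acts as the teacher (with the textual rule in its context) and the student (without the rule), and the student is trained to match the teacher's in-context predictions so the rule knowledge moves into the parameters. The model name, example rule, prompts, and the KL-divergence objective are illustrative assumptions.

import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

rule = "Rule: if the input mentions a weekday, answer 'workday', otherwise 'weekend'."
question = "Input: the meeting is on Tuesday.\nAnswer:"
answer = " workday"

def answer_logits(prompt: str, answer: str, no_grad: bool) -> torch.Tensor:
    """Return the model's logits at the positions that predict the answer tokens."""
    prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids
    answer_ids = tokenizer(answer, return_tensors="pt").input_ids
    input_ids = torch.cat([prompt_ids, answer_ids], dim=1)
    ctx = torch.no_grad() if no_grad else torch.enable_grad()
    with ctx:
        logits = model(input_ids).logits
    # The position predicting answer token i is one step before it.
    n = answer_ids.size(1)
    return logits[:, -n - 1:-1, :]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

# Teacher pass: the rule is visible in-context; these predictions carry the rule knowledge.
teacher = answer_logits(rule + "\n" + question, answer, no_grad=True)
# Student pass: same model and question, but without the rule in the prompt.
student = answer_logits(question, answer, no_grad=False)

# Distillation loss: push the rule-free predictions toward the in-context ones,
# encoding the rule into the model parameters.
loss = F.kl_div(F.log_softmax(student, dim=-1),
                F.softmax(teacher, dim=-1),
                reduction="batchmean")
loss.backward()
optimizer.step()

In practice one would loop this over many rule-covered inputs and hold out unseen inputs to test whether the fine-tuned, rule-free model still follows the rule; the single gradient step above only illustrates the training signal.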
Anthology ID:
2025.coling-main.61
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
913–932
URL:
https://aclanthology.org/2025.coling-main.61/
Cite (ACL):
Wenkai Yang, Yankai Lin, Jie Zhou, and Ji-Rong Wen. 2025. Distilling Rule-based Knowledge into Large Language Models. In Proceedings of the 31st International Conference on Computational Linguistics, pages 913–932, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Distilling Rule-based Knowledge into Large Language Models (Yang et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.61.pdf