KELE: A Multi-Agent Framework for Structured Socratic Teaching with Large Language Models

Xian Peng, Pan Yuan, Dong Li, Junlong Cheng, Qin Fang, Zhi Liu


Abstract
Socratic teaching, with its emphasis on heuristic questioning and deep thinking, has demonstrated significant advantages in promoting students’ cognitive development. However, traditional Socratic teaching places high demands on teachers’ expertise and real-time feedback capabilities, making it difficult to scale in large educational settings. Recent breakthroughs by large language models (LLMs) in natural language generation and dialogue comprehension make automated Socratic teaching possible. In this paper, we propose Knowledge-Enlightened Learning Enhanced by LLMs (KELE), a novel multi-agent framework for structured Socratic teaching with LLMs. KELE constructs a structured Socratic teaching rule system (SocRule) and a “consultant–teacher” multi-agent collaborative teaching mechanism, in which two LLMs handle teaching planning and execution respectively, ensuring a logically coherent and hierarchically structured Socratic teaching process. We also construct SocratDataset, a structured Socratic teaching dataset covering 34 teaching strategies and over 42,000 dialogue turns, and train SocratTeachLLM, a specialized LLM for Socratic teaching tasks. Additionally, we build a comprehensive evaluation system for the Socratic teaching quality of LLMs, covering 9 dimensions that span single-turn dialogue to multi-turn teaching processes. Experimental results show that SocratTeachLLM significantly outperforms GPT-4o, despite GPT-4o’s much larger parameter count, across all Socratic teaching capabilities.
Anthology ID:
2025.findings-emnlp.888
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
16342–16362
URL:
https://aclanthology.org/2025.findings-emnlp.888/
Cite (ACL):
Xian Peng, Pan Yuan, Dong Li, Junlong Cheng, Qin Fang, and Zhi Liu. 2025. KELE: A Multi-Agent Framework for Structured Socratic Teaching with Large Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 16342–16362, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
KELE: A Multi-Agent Framework for Structured Socratic Teaching with Large Language Models (Peng et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.888.pdf
Checklist:
2025.findings-emnlp.888.checklist.pdf