Strategize Before Teaching: A Conversational Tutoring System with Pedagogy Self-Distillation

Lingzhi Wang, Mrinmaya Sachan, Xingshan Zeng, Kam-Fai Wong


Abstract
Conversational tutoring systems (CTSs) aim to help students master educational material through natural language interaction in the form of a dialog, and they have become a key pillar of educational data mining research. A key challenge in CTSs is to engage the student in the conversation while exposing them to a diverse set of teaching strategies, akin to a human teacher, thereby helping them learn in the process. Unlike previous work, which generates responses given the strategies as input, we propose to jointly predict teaching strategies and generate tutor responses accordingly, which fits a more realistic application scenario. We benchmark several competitive models on three dialog tutoring datasets and propose a unified framework that combines tutor response generation with pedagogical strategy prediction, in which a self-distillation mechanism guides teaching-strategy learning and facilitates tutor response generation. Our experiments and analyses shed light on how teaching strategies affect dialog tutoring.
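
The abstract describes a joint objective: predict the pedagogical strategy first, then generate the tutor response, with a self-distillation signal guiding strategy learning. The paper's concrete architecture is not reproduced on this page, so the Python sketch below is only a hypothetical illustration of one way such an objective could be composed. The model class StrategizeThenTeach, the GRU encoder-decoder, the teacher pass over history plus gold response, and the hyperparameters tau and lam are all assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class StrategizeThenTeach(nn.Module):
    """Hypothetical joint model: predict a teaching strategy from the
    dialog history, then decode the tutor response (names invented)."""
    def __init__(self, vocab_size=32000, hidden=512, num_strategies=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.strategy_head = nn.Linear(hidden, num_strategies)
        self.lm_head = nn.Linear(hidden, vocab_size)

    def strategy_from(self, ids):
        # Encode a token sequence; the final hidden state summarizes it.
        _, h = self.encoder(self.embed(ids))
        return self.strategy_head(h[-1]), h

def joint_loss(model, history, response, labels, strategy_gold,
               tau=2.0, lam=0.5):
    # Student pass: the strategy is predicted from the history alone,
    # mirroring deployment, where no gold strategy is given as input.
    stu_logits, h = model.strategy_from(history)
    dec_out, _ = model.decoder(model.embed(response), h)
    token_logits = model.lm_head(dec_out)
    gen_loss = F.cross_entropy(token_logits.flatten(0, 1), labels.flatten())
    strat_loss = F.cross_entropy(stu_logits, strategy_gold)
    # Self-distillation (one assumed form): a teacher pass over the
    # history plus the gold response yields softer strategy targets
    # that the history-only student is trained to match.
    with torch.no_grad():
        tea_logits, _ = model.strategy_from(
            torch.cat([history, response], dim=1))
    distill = F.kl_div(F.log_softmax(stu_logits / tau, dim=-1),
                       F.softmax(tea_logits / tau, dim=-1),
                       reduction="batchmean") * tau ** 2
    return gen_loss + strat_loss + lam * distill

# Toy usage with random token ids (batch of 4).
model = StrategizeThenTeach()
hist = torch.randint(0, 32000, (4, 20))
resp = torch.randint(0, 32000, (4, 15))
labels = torch.randint(0, 32000, (4, 15))
strat = torch.randint(0, 10, (4,))
joint_loss(model, hist, resp, labels, strat).backward()

In this reading, the distillation term softens the strategy supervision: the history-only "student" view is pulled toward the distribution produced by a "teacher" view of the same model that also sees the gold response, which is one common way to realize self-distillation within a single network.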
Anthology ID:
2023.findings-eacl.170
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2268–2274
URL:
https://aclanthology.org/2023.findings-eacl.170
DOI:
10.18653/v1/2023.findings-eacl.170
Cite (ACL):
Lingzhi Wang, Mrinmaya Sachan, Xingshan Zeng, and Kam-Fai Wong. 2023. Strategize Before Teaching: A Conversational Tutoring System with Pedagogy Self-Distillation. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2268–2274, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Strategize Before Teaching: A Conversational Tutoring System with Pedagogy Self-Distillation (Wang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.170.pdf
Video:
https://aclanthology.org/2023.findings-eacl.170.mp4