Toward In-Context Teaching: Adapting Examples to Students’ Misconceptions

Alexis Ross, Jacob Andreas


Abstract
When a teacher provides examples for a student to study, these examples must be informative, enabling a student to progress from their current state toward a target concept or skill. Good teachers must therefore simultaneously infer what students already know and adapt their teaching to students' changing state of knowledge. There is increasing interest in using computational models, particularly large language models, as pedagogical tools. As students, language models have shown a remarkable ability to adapt to new tasks given small numbers of examples. But how effectively can these models adapt as teachers to students of different types? To study this question, we introduce a suite of models and evaluation methods we call AdapT. AdapT has two components: (1) a collection of simulated Bayesian student models that can be used for evaluation of automated teaching methods; (2) a platform for evaluation with human students, to characterize the real-world effectiveness of these methods. We additionally introduce (3) AToM, a new probabilistic method for adaptive teaching that jointly infers students' past beliefs and optimizes for the correctness of future beliefs. In evaluations of simulated students across three learning domains (fraction arithmetic, English morphology, function learning), AToM systematically outperforms LLM-based and standard Bayesian teaching methods. In human experiments, both AToM and LLMs outperform non-adaptive random example selection. Our results highlight both the difficulty of the adaptive teaching task and the potential of learned adaptive methods for solving it.
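As a rough illustration of the adaptive-teaching loop the abstract describes (infer a student's beliefs from their answers, then pick the example expected to most improve their future correctness), the sketch below uses a hypothetical two-hypothesis fraction-addition domain. The hypothesis names, the 0.9 response-noise level, and the candidate pool are all invented for illustration; this is not the paper's AToM implementation.

```python
# Minimal sketch of adaptive Bayesian teaching, assuming a toy fraction-addition
# domain with one misconception hypothesis and one correct-rule hypothesis.
# All names and constants here are hypothetical stand-ins, not the paper's code.

HYPOTHESES = ("add_parts_separately", "correct_rule")

def answer(h, problem):
    """Predicted answer to (a/b) + (c/d) under hypothesis h."""
    (a, b), (c, d) = problem
    if h == "add_parts_separately":        # misconception: a/b + c/d = (a+c)/(b+d)
        return (a + c, b + d)
    return (a * d + c * b, b * d)          # correct common-denominator rule

def update(belief, problem, observed):
    """Noisy Bayesian update over hypotheses given one observed answer."""
    post = {h: p * (0.9 if answer(h, problem) == observed else 0.1)
            for h, p in belief.items()}
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

def pick_example(belief, pool, eval_set):
    """Select the demo that, after the simulated student updates on its correct
    label, maximizes expected accuracy on eval_set (the 'optimize future
    beliefs' step; `update` doubles as the simulated Bayesian student)."""
    def score(ex):
        b = update(belief, ex, answer("correct_rule", ex))
        return sum(p for h, p in b.items()
                   if all(answer(h, q) == answer("correct_rule", q)
                          for q in eval_set))
    return max(pool, key=score)

if __name__ == "__main__":
    belief = {h: 1 / len(HYPOTHESES) for h in HYPOTHESES}   # teacher's prior
    diag = ((1, 2), (1, 3))
    # (1) Infer the student's past beliefs from a diagnostic answer...
    belief = update(belief, diag, answer("add_parts_separately", diag))
    # (2) ...then adaptively select the next teaching example.
    pool = [((1, 2), (1, 2)), ((1, 2), (1, 3)), ((2, 3), (1, 6))]
    print(pick_example(belief, pool, eval_set=[((1, 4), (1, 3))]))
```

The paper's actual domains (fraction arithmetic, English morphology, function learning) involve much larger hypothesis spaces and noisier student behavior, but the two-step structure, posterior inference over student state followed by forward-looking example selection, is the same.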
Anthology ID:
2024.acl-long.718
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
13283–13310
URL:
https://aclanthology.org/2024.acl-long.718
Cite (ACL):
Alexis Ross and Jacob Andreas. 2024. Toward In-Context Teaching: Adapting Examples to Students’ Misconceptions. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 13283–13310, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Toward In-Context Teaching: Adapting Examples to Students’ Misconceptions (Ross & Andreas, ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.718.pdf