Trace-of-Thought Prompting: Investigating Prompt-Based Knowledge Distillation Through Question Decomposition

Tyler McDonald, Ali Emami


Abstract
Knowledge distillation allows smaller neural networks to emulate the performance of larger teacher models with reduced computational demands. Traditional methods for Large Language Models (LLMs) often necessitate extensive fine-tuning, which limits their accessibility. To address this, we introduce Trace-of-Thought Prompting, a novel framework designed to distill critical reasoning capabilities from large-scale teacher models (over 8 billion parameters) to small-scale student models (up to 8 billion parameters). This approach leverages problem decomposition to enhance interpretability and facilitate human-in-the-loop interventions. Empirical evaluations on the GSM8K and MATH datasets show that student models achieve accuracy gains of up to 113% on GSM8K and 20% on MATH, with significant improvements particularly notable in smaller models like Llama 2 and Zephyr. Our results suggest a promising pathway for open-source, small-scale models to eventually serve as both students and teachers, potentially reducing our reliance on large-scale, proprietary models. Our code, featuring data analytics and testing scripts, is provided here: https://github.com/traceofthought/trace-of-thought-prompting/tree/main.
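To make the two-stage pipeline the abstract describes concrete, here is a minimal sketch: a teacher model decomposes a problem into sub-problems, and a student model solves the problem by following that decomposition. The prompts and the query_llm helper below are hypothetical illustrations, not the authors' actual implementation; their prompts and scripts are in the linked repository.

# Hypothetical sketch of Trace-of-Thought prompting as described in the
# abstract. `query_llm` is a placeholder, not the authors' API; swap in
# any chat/completion client (API-based or local).

def query_llm(model: str, prompt: str) -> str:
    """Placeholder for a completion call to `model` with `prompt`."""
    raise NotImplementedError("Plug in an LLM client here.")

def trace_of_thought(problem: str, teacher: str, student: str) -> str:
    # Stage 1 (teacher): decompose the problem without solving it.
    decompose_prompt = (
        "Break the following problem into a numbered list of simple "
        f"sub-problems. Do not solve them.\n\nProblem: {problem}"
    )
    plan = query_llm(teacher, decompose_prompt)
    # Stage 2 (student): solve the problem by following the teacher's plan.
    solve_prompt = (
        f"Problem: {problem}\n\nPlan:\n{plan}\n\n"
        "Work through the plan step by step, then state the final answer."
    )
    return query_llm(student, solve_prompt)

# Example usage (hypothetical model names):
# answer = trace_of_thought(
#     "A train travels 60 km in 45 minutes. What is its speed in km/h?",
#     teacher="gpt-4", student="llama-2-7b-chat",
# )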
Anthology ID:
2024.acl-srw.35
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Xiyan Fu, Eve Fleisig
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
397–410
URL:
https://aclanthology.org/2024.acl-srw.35
DOI:
10.18653/v1/2024.acl-srw.35
Cite (ACL):
Tyler McDonald and Ali Emami. 2024. Trace-of-Thought Prompting: Investigating Prompt-Based Knowledge Distillation Through Question Decomposition. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 397–410, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Trace-of-Thought Prompting: Investigating Prompt-Based Knowledge Distillation Through Question Decomposition (McDonald & Emami, ACL 2024)
PDF:
https://aclanthology.org/2024.acl-srw.35.pdf