Boosting Scientific Concepts Understanding: Can Analogy from Teacher Models Empower Student Models?

Siyu Yuan, Cheng Jiayang, Lin Qiu, Deqing Yang


Abstract
Analogical reasoning plays a critical role in human cognition, enabling us to understand new concepts by associating them with familiar ones. Previous research in the AI community has mainly focused on identifying and generating analogies and then examining their quality under human evaluation, which overlooks the practical application of these analogies in real-world settings. Inspired by the human education process, in this paper we investigate how analogies created by teacher language models (LMs) can assist student LMs in understanding scientific concepts, thereby aligning more closely with practical scenarios. Our results suggest that free-form analogies can indeed aid LMs in understanding concepts. Additionally, analogies generated by student LMs can improve their own performance on scientific question answering, demonstrating their capability to use analogies to learn new knowledge on their own. Resources are available at https://github.com/siyuyuan/SCUA.
Anthology ID:
2024.emnlp-main.346
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6026–6036
URL:
https://aclanthology.org/2024.emnlp-main.346
Cite (ACL):
Siyu Yuan, Cheng Jiayang, Lin Qiu, and Deqing Yang. 2024. Boosting Scientific Concepts Understanding: Can Analogy from Teacher Models Empower Student Models?. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 6026–6036, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Boosting Scientific Concepts Understanding: Can Analogy from Teacher Models Empower Student Models? (Yuan et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.346.pdf