FinePrompt: Unveiling the Role of Finetuned Inductive Bias on Compositional Reasoning in GPT-4

Jeonghwan Kim, Giwon Hong, Sung-Hyon Myaeng, Joyce Whang


Abstract
Compositional reasoning across texts has been a long-standing challenge in natural language processing. With large language models (LLMs) like GPT-4 taking over the field, prompting techniques such as chain-of-thought (CoT) have been proposed to unlock the compositional, multi-step reasoning capabilities of LLMs. Despite their success, such prompts demand significant human effort to discover and validate. Our work draws attention to the idea of transferring task-specific inductive biases from finetuned models to prompts as a way of improving GPT-4’s compositional reasoning capabilities. To leverage these inductive biases, we formulate prompt templates that ease their transfer. Experimental results on multi-hop question answering and numerical reasoning over text show that our proposed prompt scheme achieves competitive zero-shot and few-shot performance compared to existing prompts on complex reasoning tasks, highlighting the importance of adopting the validated biases of the previous finetuning paradigm.
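As a rough illustration of the idea the abstract describes — encoding a task-specific inductive bias from the finetuning era (e.g., explicit supporting-fact selection in multi-hop QA) directly into a prompt template — here is a minimal Python sketch. The template wording and the names MULTIHOP_TEMPLATE and build_fineprompt are hypothetical assumptions for illustration, not the paper's actual prompts; see the PDF linked below for the real templates.

```python
# Hypothetical sketch: a prompt template that bakes in a finetuned
# model's inductive bias (supporting-fact selection before answering),
# in the spirit of the abstract. Not the paper's actual template.

MULTIHOP_TEMPLATE = """You are answering a multi-hop question.
First, list the sentences from the passages that serve as supporting
facts (one hop per line). Then combine them to derive the final answer.

Passages:
{passages}

Question: {question}

Supporting facts:"""

def build_fineprompt(passages: list[str], question: str) -> str:
    """Instantiate the (illustrative) template with task inputs."""
    joined = "\n".join(f"[{i}] {p}" for i, p in enumerate(passages, start=1))
    return MULTIHOP_TEMPLATE.format(passages=joined, question=question)

if __name__ == "__main__":
    prompt = build_fineprompt(
        ["Marie Curie was born in Warsaw.",
         "Warsaw is the capital of Poland."],
        "In which country was Marie Curie born?",
    )
    print(prompt)  # send to GPT-4 via your preferred API client
```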
Anthology ID:
2023.findings-emnlp.245
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3763–3775
URL:
https://aclanthology.org/2023.findings-emnlp.245
DOI:
10.18653/v1/2023.findings-emnlp.245
Cite (ACL):
Jeonghwan Kim, Giwon Hong, Sung-Hyon Myaeng, and Joyce Whang. 2023. FinePrompt: Unveiling the Role of Finetuned Inductive Bias on Compositional Reasoning in GPT-4. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 3763–3775, Singapore. Association for Computational Linguistics.
Cite (Informal):
FinePrompt: Unveiling the Role of Finetuned Inductive Bias on Compositional Reasoning in GPT-4 (Kim et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.245.pdf