DDPrompt: Differential Diversity Prompting in Large Language Models

Lin Mu, Wenhao Zhang, Yiwen Zhang, Peiquan Jin


Abstract
Large Language Models (LLMs) have shown that their reasoning ability can be enhanced through approaches like Chain-of-Thought (CoT) prompting. However, these methods use a single prompt for different types of questions and do not design appropriate prompts for questions with different characteristics. In this paper, we aim to explore a methodology that generates differentially diverse reasoning paths for different types of questions. To achieve this, we propose a novel prompting strategy called Differential Diversity Prompting (DDPrompt). First, we generate an optimal prompt collection based on question characteristics. Then, we use this collection to generate multiple answers for a question and choose the final answer by voting. We evaluated DDPrompt on twelve reasoning benchmarks and observed significant improvements in the performance of LLMs on complex reasoning tasks (e.g., GSM8K 75%->84%, Tracking Shuffled Objects 68.8%->83.9%).
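The two-stage recipe in the abstract (select prompts by question characteristics, then vote over the answers they produce) can be illustrated with a minimal Python sketch. This is not the paper's released implementation: `query_llm`, `select_prompts`, and the precomputed `dev_scores` map are all hypothetical stand-ins, and the paper's actual prompt-selection criterion may differ.

```python
from collections import Counter

# Hypothetical helper: sends one prompt plus the question to an LLM and
# returns a final answer string. Any LLM client could back this; it is
# not an API from the paper.
def query_llm(prompt: str, question: str) -> str:
    raise NotImplementedError("plug in an LLM client here")

def select_prompts(question: str, candidate_prompts: list[str],
                   dev_scores: dict[str, float], k: int = 3) -> list[str]:
    """Stage 1 (sketch): keep the k candidate prompts that scored best on
    held-out questions with characteristics like this one. dev_scores is
    an assumed precomputed prompt -> accuracy map."""
    ranked = sorted(candidate_prompts,
                    key=lambda p: dev_scores.get(p, 0.0), reverse=True)
    return ranked[:k]

def ddprompt_answer(question: str, candidate_prompts: list[str],
                    dev_scores: dict[str, float]) -> str:
    """Stage 2 (sketch): query the LLM once per selected prompt, then take
    a majority vote over the returned answers."""
    prompts = select_prompts(question, candidate_prompts, dev_scores)
    answers = [query_llm(p, question) for p in prompts]
    return Counter(answers).most_common(1)[0][0]
```

Under these assumptions, the diversity comes from running the same question through several differently selected prompts, and the vote aggregates their reasoning paths into one final answer.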
Anthology ID:
2024.acl-short.17
Original:
2024.acl-short.17v1
Version 2:
2024.acl-short.17v2
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
168–174
URL:
https://aclanthology.org/2024.acl-short.17
DOI:
10.18653/v1/2024.acl-short.17
Cite (ACL):
Lin Mu, Wenhao Zhang, Yiwen Zhang, and Peiquan Jin. 2024. DDPrompt: Differential Diversity Prompting in Large Language Models. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 168–174, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
DDPrompt: Differential Diversity Prompting in Large Language Models (Mu et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-short.17.pdf