Mirror-Consistency: Harnessing Inconsistency in Majority Voting

Siyuan Huang, Zhiyuan Ma, Jintao Du, Changhua Meng, Weiqiang Wang, Zhouhan Lin


Abstract
Self-Consistency, a widely used decoding strategy, significantly boosts the reasoning capabilities of Large Language Models (LLMs). However, it depends on the plurality voting rule, which focuses on the most frequent answer while overlooking all other minority responses. These inconsistent minority views often illuminate areas of uncertainty within the model’s generation process. To address this limitation, we present Mirror-Consistency, an enhancement of the standard Self-Consistency approach. Our method incorporates a ‘reflective mirror’ into the self-ensemble decoding process and enables LLMs to critically examine inconsistencies among multiple generations. Additionally, just as humans use a mirror to better understand themselves, we propose using Mirror-Consistency to enhance sample-based confidence calibration methods, which helps to mitigate issues of overconfidence. Our experimental results demonstrate that Mirror-Consistency yields superior performance in both reasoning accuracy and confidence calibration compared to Self-Consistency.
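The plurality voting baseline the abstract critiques can be sketched as follows. This is a minimal illustration, not the authors' released code; the function name `self_consistency_vote` and the naive frequency-based confidence are assumptions for the example.

```python
from collections import Counter

def self_consistency_vote(answers):
    """Plurality vote over sampled answers (standard Self-Consistency).

    Minority answers are simply discarded, even though their disagreement
    with the majority signals model uncertainty -- the information
    Mirror-Consistency aims to exploit instead.
    """
    counts = Counter(answers)
    winner, freq = counts.most_common(1)[0]
    # Naive sample-based confidence: fraction of samples agreeing
    # with the winner; prone to the overconfidence the paper discusses.
    confidence = freq / len(answers)
    return winner, confidence

# Five sampled reasoning chains agreeing 3-to-2:
# self_consistency_vote(["42", "42", "17", "42", "17"]) -> ("42", 0.6)
```

Mirror-Consistency, by contrast, would feed the disagreement between the "42" and "17" clusters back to the model for reflection rather than dropping the minority outright.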
Anthology ID:
2024.findings-emnlp.135
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2408–2420
URL:
https://aclanthology.org/2024.findings-emnlp.135
Cite (ACL):
Siyuan Huang, Zhiyuan Ma, Jintao Du, Changhua Meng, Weiqiang Wang, and Zhouhan Lin. 2024. Mirror-Consistency: Harnessing Inconsistency in Majority Voting. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 2408–2420, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Mirror-Consistency: Harnessing Inconsistency in Majority Voting (Huang et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.135.pdf
Software:
 2024.findings-emnlp.135.software.zip
Data:
 2024.findings-emnlp.135.data.zip