CASE: Commonsense-Augmented Score with an Expanded Answer Space

Wenkai Chen, Sahithya Ravi, Vered Shwartz


Abstract
LLMs have demonstrated impressive zero-shot performance on NLP tasks thanks to the knowledge they acquire during training. In multiple-choice QA tasks, LM probabilities are used as an imperfect measure of the plausibility of each answer choice. A major limitation of this basic score is that it treats all words as equally important. We propose CASE, a Commonsense-Augmented Score with an Expanded Answer Space. CASE addresses this limitation by assigning importance weights to individual words based on their semantic relations to other words in the input. The dynamic weighting approach outperforms basic LM scores, not only because it reduces noise from unimportant words, but also because it informs the model of implicit commonsense knowledge that may be useful for answering the question. We also follow prior work in expanding the answer space by generating lexically-divergent answers that are conceptually similar to the original choices. When combined with answer space expansion, our method outperforms strong baselines on 5 commonsense benchmarks. We further show that these two approaches are complementary and may be especially beneficial when using smaller LMs.
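
To make the scoring idea concrete, below is a minimal sketch of a token-weighted LM plausibility score for multiple-choice QA, assuming a Hugging Face causal LM (gpt2 here). The weighting function is left as an abstract placeholder (uniform by default): CASE derives its weights from commonsense relations between words in the input, which is not reproduced in this sketch, and the answer-space expansion step is omitted.

```python
# Sketch: score each answer choice by the weighted sum of per-token
# log-probabilities under a causal LM. weight_fn is a placeholder for
# CASE's commonsense-based importance weights (uniform if not given).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()


def weighted_lm_score(question: str, answer: str, weight_fn=None) -> float:
    """Weighted log-probability of the answer tokens given the question."""
    q_ids = tokenizer(question, return_tensors="pt").input_ids
    a_ids = tokenizer(" " + answer, return_tensors="pt").input_ids
    input_ids = torch.cat([q_ids, a_ids], dim=1)

    with torch.no_grad():
        logits = model(input_ids).logits
    log_probs = torch.log_softmax(logits, dim=-1)

    score = 0.0
    offset = q_ids.shape[1]
    for i in range(a_ids.shape[1]):
        token_id = int(a_ids[0, i])
        # Log-probability of the i-th answer token given all preceding tokens.
        token_logp = log_probs[0, offset + i - 1, token_id].item()
        weight = 1.0 if weight_fn is None else weight_fn(tokenizer.decode([token_id]))
        score += weight * token_logp
    return score


question = "Where would you put a cup after using it?"
choices = ["in the dishwasher", "on the moon"]
best = max(choices, key=lambda c: weighted_lm_score(question, c))
print(best)
```

With a uniform weight this reduces to the basic LM score the paper critiques; plugging in a weight function that upweights tokens semantically related to the question is the kind of dynamic weighting the abstract describes.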
Anthology ID:
2023.findings-emnlp.180
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2732–2744
URL:
https://aclanthology.org/2023.findings-emnlp.180
DOI:
10.18653/v1/2023.findings-emnlp.180
Cite (ACL):
Wenkai Chen, Sahithya Ravi, and Vered Shwartz. 2023. CASE: Commonsense-Augmented Score with an Expanded Answer Space. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 2732–2744, Singapore. Association for Computational Linguistics.
Cite (Informal):
CASE: Commonsense-Augmented Score with an Expanded Answer Space (Chen et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.180.pdf