Investigating LLMs as Voting Assistants via Contextual Augmentation: A Case Study on the European Parliament Elections 2024

Ilias Chalkidis


Abstract
In light of the recent 2024 European Parliament elections, we investigate whether LLMs can be used as Voting Advice Applications (VAAs). We audit MISTRAL and MIXTRAL models and evaluate their accuracy in predicting the stance of political parties based on the latest “EU and I” voting assistance questionnaire. Furthermore, we explore alternatives to improve the models’ performance by augmenting the input context via Retrieval-Augmented Generation (RAG) relying on web search, and via Self-Reflection using staged conversations that aim to recollect relevant content from the model’s internal memory. We find that MIXTRAL is highly accurate, with 82% accuracy on average, but with a significant performance disparity across political groups (50–95%). Augmenting the input context with expert-curated information yields a significant boost of approx. 9%, which remains an open challenge for automated RAG approaches, even when considering curated content.
Anthology ID:
2024.emnlp-main.312
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5455–5467
URL:
https://aclanthology.org/2024.emnlp-main.312
DOI:
10.18653/v1/2024.emnlp-main.312
Cite (ACL):
Ilias Chalkidis. 2024. Investigating LLMs as Voting Assistants via Contextual Augmentation: A Case Study on the European Parliament Elections 2024. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 5455–5467, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Investigating LLMs as Voting Assistants via Contextual Augmentation: A Case Study on the European Parliament Elections 2024 (Chalkidis, EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.312.pdf