Contextualizing Argument Quality Assessment with Relevant Knowledge

Darshan Deshpande, Zhivar Sourati, Filip Ilievski, Fred Morstatter


Abstract
Automatic assessment of the quality of arguments has been recognized as a challenging task with significant implications for misinformation and targeted speech. While real-world arguments are tightly anchored in context, existing computational methods analyze their quality in isolation, which affects their accuracy and generalizability. We propose SPARK: a novel method for scoring argument quality based on contextualization via relevant knowledge. We devise four augmentations that leverage large language models to provide feedback, infer hidden assumptions, supply a similar-quality argument, or give a counter-argument. SPARK uses a dual-encoder Transformer architecture to enable the original argument and its augmentation to be considered jointly. Our experiments in both in-domain and zero-shot setups show that SPARK consistently outperforms existing techniques across multiple metrics.
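The abstract describes a dual-encoder setup in which an argument and its LLM-generated augmentation are encoded separately and scored jointly. The following is a minimal sketch of that idea only, not the paper's implementation: the hash-based `toy_encode` stand-in, the function names, and the linear scoring head are all assumptions made so the example runs without model weights.

```python
import numpy as np

def toy_encode(text: str, dim: int = 8) -> np.ndarray:
    # Hypothetical stand-in for a Transformer encoder: a deterministic
    # (within one process) hash-seeded embedding, so the sketch runs
    # without any pretrained model.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(dim)

def dual_encoder_score(argument: str, augmentation: str,
                       w: np.ndarray, b: float) -> float:
    # Dual-encoder scoring: encode the argument and its augmentation
    # separately, concatenate the two representations, and map the
    # joint vector to a quality score in (0, 1) with a linear head
    # followed by a sigmoid. The weights w, b would be learned.
    joint = np.concatenate([toy_encode(argument), toy_encode(augmentation)])
    return float(1.0 / (1.0 + np.exp(-(joint @ w + b))))
```

With untrained (zero) weights the head is uninformative and returns 0.5 for any pair; training would fit `w` and `b` to human quality labels.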
Anthology ID:
2024.naacl-short.28
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
316–326
URL:
https://aclanthology.org/2024.naacl-short.28
DOI:
10.18653/v1/2024.naacl-short.28
Cite (ACL):
Darshan Deshpande, Zhivar Sourati, Filip Ilievski, and Fred Morstatter. 2024. Contextualizing Argument Quality Assessment with Relevant Knowledge. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers), pages 316–326, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Contextualizing Argument Quality Assessment with Relevant Knowledge (Deshpande et al., NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-short.28.pdf