The Moral Debater: A Study on the Computational Generation of Morally Framed Arguments

Milad Alshomary, Roxanne El Baff, Timon Gurcke, Henning Wachsmuth


Abstract
An audience’s prior beliefs and morals are strong indicators of how likely they are to be affected by a given argument. Utilizing such knowledge can help focus on shared values to bring disagreeing parties towards agreement. In argumentation technology, however, this has barely been exploited so far. This paper studies the feasibility of automatically generating morally framed arguments as well as their effect on different audiences. Following moral foundation theory, we propose a system that effectively generates arguments focusing on different morals. In an in-depth user study, we ask liberals and conservatives to evaluate the impact of these arguments. Our results suggest that, particularly when prior beliefs are challenged, an audience becomes more affected by morally framed arguments.
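
To make the idea of moral framing concrete, the sketch below shows one plausible way to condition a text generator on a Moral Foundations Theory label via a control-code prompt. This is an illustrative assumption only, not the authors' system: the model (gpt2), the prompt format, and the generate_moral_argument helper are hypothetical placeholders.

# Illustrative sketch only -- not the authors' system. Assumes a control-code
# prompt and an off-the-shelf generator; model and prompt format are placeholders.
from transformers import pipeline

# The five foundations of Moral Foundations Theory, used here as conditioning labels.
MORAL_FOUNDATIONS = ["care", "fairness", "loyalty", "authority", "purity"]

generator = pipeline("text-generation", model="gpt2")  # placeholder model

def generate_moral_argument(topic: str, stance: str, foundation: str) -> str:
    """Generate an argument on `topic` framed around one moral foundation."""
    if foundation not in MORAL_FOUNDATIONS:
        raise ValueError(f"Unknown moral foundation: {foundation}")
    # Control-code style prompt encoding topic, stance, and the chosen moral.
    prompt = f"<topic> {topic} <stance> {stance} <moral> {foundation} <argument>"
    output = generator(prompt, max_new_tokens=80, do_sample=True, top_p=0.9)
    return output[0]["generated_text"]

# Example: a pro argument on school uniforms framed around fairness.
print(generate_moral_argument("school uniforms", "pro", "fairness"))
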
Anthology ID: 2022.acl-long.601
Volume: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: May
Year: 2022
Address: Dublin, Ireland
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 8782–8797
URL: https://aclanthology.org/2022.acl-long.601
DOI: 10.18653/v1/2022.acl-long.601
Cite (ACL): Milad Alshomary, Roxanne El Baff, Timon Gurcke, and Henning Wachsmuth. 2022. The Moral Debater: A Study on the Computational Generation of Morally Framed Arguments. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 8782–8797, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): The Moral Debater: A Study on the Computational Generation of Morally Framed Arguments (Alshomary et al., ACL 2022)
PDF: https://aclanthology.org/2022.acl-long.601.pdf
Software: 2022.acl-long.601.software.zip
Video: https://aclanthology.org/2022.acl-long.601.mp4
Code: webis-de/acl-22