Evaluating Intention Detection Capability of Large Language Models in Persuasive Dialogues

Hiromasa Sakurai, Yusuke Miyao


Abstract
We investigate intention detection in persuasive multi-turn dialogues employing the largest available Large Language Models (LLMs). Much of the prior research measures the intention detection capability of machine learning models without considering conversational history. To evaluate LLMs' intention detection capability in conversation, we modified existing datasets of persuasive conversation and created datasets using a multiple-choice paradigm. Considering others' perspectives through their utterances is crucial in persuasive conversation, especially when making a request or reply that is inconvenient for the other party. This feature makes persuasive dialogue well suited as a dataset for measuring intention detection capability. We incorporate the concept of 'face acts,' which categorizes how utterances affect mental states. This approach enables us to measure intention detection capability by focusing on crucial intentions and to conduct comprehensible analyses by intention type.
Anthology ID:
2024.acl-long.90
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Note:
Pages:
1635–1657
URL:
https://aclanthology.org/2024.acl-long.90
Cite (ACL):
Hiromasa Sakurai and Yusuke Miyao. 2024. Evaluating Intention Detection Capability of Large Language Models in Persuasive Dialogues. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1635–1657, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Evaluating Intention Detection Capability of Large Language Models in Persuasive Dialogues (Sakurai & Miyao, ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.90.pdf