SaFeRDialogues: Taking Feedback Gracefully after Conversational Safety Failures

Megan Ung, Jing Xu, Y-Lan Boureau


Abstract
Current open-domain conversational models can easily be made to talk in inadequate ways. Online learning from conversational feedback given by the conversation partner is a promising avenue for a model to improve and adapt, so as to generate fewer of these safety failures. However, current state-of-the-art models tend to react to feedback with defensive or oblivious responses. This makes for an unpleasant experience and may discourage conversation partners from giving feedback in the future. This work proposes SaFeRDialogues, a task and dataset of graceful responses to conversational feedback about safety failures. We collect a dataset of 8k dialogues demonstrating safety failures, feedback signaling them, and a response acknowledging the feedback. We show how fine-tuning on this dataset results in conversations that human raters deem considerably more likely to lead to a civil conversation, without sacrificing engagingness or general conversational ability.
Anthology ID: 2022.acl-long.447
Volume: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: May
Year: 2022
Address: Dublin, Ireland
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 6462–6481
URL: https://aclanthology.org/2022.acl-long.447
DOI: 10.18653/v1/2022.acl-long.447
Cite (ACL): Megan Ung, Jing Xu, and Y-Lan Boureau. 2022. SaFeRDialogues: Taking Feedback Gracefully after Conversational Safety Failures. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6462–6481, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): SaFeRDialogues: Taking Feedback Gracefully after Conversational Safety Failures (Ung et al., ACL 2022)
PDF: https://aclanthology.org/2022.acl-long.447.pdf
Video: https://aclanthology.org/2022.acl-long.447.mp4