Proxy Indicators for the Quality of Open-domain Dialogues

Rostislav Nedelchev, Jens Lehmann, Ricardo Usbeck


Abstract
The automatic evaluation of open-domain dialogues remains a largely unsolved challenge. Despite the abundance of work in the field, dialogue quality still has to be judged by human evaluators, so performing such evaluations at scale is usually expensive. This work investigates using a deep-learning model trained on the General Language Understanding Evaluation (GLUE) benchmark as a quality indicator for open-domain dialogues. The aim is to use the various GLUE tasks as different perspectives for judging the quality of a conversation, thus reducing the need for additional training data or reference responses. Because of this design, the method can infer various quality metrics and derive a component-based overall score. We achieve statistically significant correlation coefficients of up to 0.7.
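
For illustration only (this is not the authors' released implementation), the sketch below shows one way such proxy indicators could be combined: classifiers fine-tuned on two GLUE tasks, CoLA (linguistic acceptability, read as a fluency proxy) and MultiNLI (entailment between context and response, read as a coherence proxy), are applied to a dialogue turn and their probabilities averaged into a component-based overall score. The model paths and label conventions are placeholders and must be replaced with real fine-tuned checkpoints.

# Illustrative sketch only; assumptions are marked in comments.
from transformers import pipeline

# Placeholder checkpoints -- substitute any models fine-tuned on CoLA and MultiNLI.
cola = pipeline("text-classification", model="path/to/cola-finetuned-model")
mnli = pipeline("text-classification", model="path/to/mnli-finetuned-model")

def proxy_scores(context: str, response: str) -> dict:
    """Per-task proxy indicators plus a naive component-based overall score."""
    # CoLA acceptability of the response alone, used as a fluency proxy.
    # Assumption: the positive ("acceptable") class label ends with "1".
    fl = cola(response)[0]
    fluency = fl["score"] if fl["label"].endswith("1") else 1.0 - fl["score"]

    # MultiNLI entailment between context and response, used as a coherence proxy.
    # Assumption: the entailment label contains the substring "entail";
    # for the 3-class task, 1 - score is only an upper bound on entailment probability.
    nli = mnli({"text": context, "text_pair": response})[0]
    coherence = nli["score"] if "entail" in nli["label"].lower() else 1.0 - nli["score"]

    overall = (fluency + coherence) / 2.0  # unweighted average of the components
    return {"fluency": fluency, "coherence": coherence, "overall": overall}

print(proxy_scores("Do you like jazz?", "Yes, I listen to Miles Davis every morning."))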
Anthology ID:
2021.emnlp-main.618
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7834–7855
URL:
https://aclanthology.org/2021.emnlp-main.618
DOI:
10.18653/v1/2021.emnlp-main.618
Cite (ACL):
Rostislav Nedelchev, Jens Lehmann, and Ricardo Usbeck. 2021. Proxy Indicators for the Quality of Open-domain Dialogues. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 7834–7855, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Proxy Indicators for the Quality of Open-domain Dialogues (Nedelchev et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.618.pdf
Video:
https://aclanthology.org/2021.emnlp-main.618.mp4
Code:
smartdataanalytics/proxy_indicators
Data:
CoLA, GLUE, MRPC, MultiNLI, QNLI, SST, SST-2, Topical-Chat, USR-PersonaChat, USR-TopicalChat, WSC