LSTDial: Enhancing Dialogue Generation via Long- and Short-Term Measurement Feedback

Guanghui Ye, Huan Zhao, Zixing Zhang, Xupeng Zha, Zhihua Jiang


Abstract
Generating high-quality responses is a key challenge for any open-domain dialogue system. However, even though a variety of quality dimensions have been designed specifically for dialogue evaluation (e.g., coherence and diversity scores), current dialogue systems rarely use them to guide response generation during training. To alleviate this issue, we propose LSTDial (Long- and Short-Term Dialogue), a novel two-stage framework that generates and utilizes conversation evaluation as explicit feedback during training. Specifically, we fine-tune pre-trained dialogue systems using turn-level quality feedback in the first stage and further train ever-improving dialogue agents using dialogue-level quality feedback in the second stage. Our approach equips dialogue systems with both short-term capabilities (generating more fluent, relevant, and varied responses at the turn level) and long-term capabilities (generating more coherent, engaging, and informative responses at the dialogue level). We implement LSTDial on four strong baseline models and experiment with two open-domain dialogue datasets. Experimental results show that LSTDial achieves significant improvements, generating better dialogue responses in terms of both human and automatic evaluation.
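The abstract describes a two-stage training scheme: turn-level quality feedback first, then dialogue-level quality feedback. Below is a minimal sketch of that general idea, assuming hypothetical scorer functions (turn_quality, dialogue_quality) and a placeholder weighted_update step; it is an illustration of reward-weighted fine-tuning under those assumptions, not the authors' actual implementation.

```python
# Sketch of two-stage training with turn- and dialogue-level quality feedback.
# All interfaces below (Turn, scorers, weighted_update) are assumptions for
# illustration only and do not reflect the paper's code.

from dataclasses import dataclass
from typing import List


@dataclass
class Turn:
    context: str   # dialogue history up to this turn
    response: str  # candidate response produced by the model


def turn_quality(turn: Turn) -> float:
    """Hypothetical short-term scorer (e.g., fluency/relevance/diversity)."""
    return 0.5  # placeholder score


def dialogue_quality(dialogue: List[Turn]) -> float:
    """Hypothetical long-term scorer (e.g., coherence/engagingness/informativeness)."""
    return 0.5  # placeholder score


def weighted_update(model, turns: List[Turn], weights: List[float]) -> None:
    """Placeholder for one reward-weighted fine-tuning step on (context, response) pairs."""
    pass  # e.g., scale each example's loss by its quality weight


def stage_one(model, dialogues: List[List[Turn]]) -> None:
    # Stage 1: turn-level feedback -- weight each turn by its own quality score.
    for dialogue in dialogues:
        weights = [turn_quality(t) for t in dialogue]
        weighted_update(model, dialogue, weights)


def stage_two(model, dialogues: List[List[Turn]]) -> None:
    # Stage 2: dialogue-level feedback -- every turn in a dialogue shares the
    # dialogue's overall quality score.
    for dialogue in dialogues:
        score = dialogue_quality(dialogue)
        weighted_update(model, dialogue, [score] * len(dialogue))
```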
Anthology ID:
2024.naacl-long.326
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5857–5871
URL:
https://aclanthology.org/2024.naacl-long.326
Cite (ACL):
Guanghui Ye, Huan Zhao, Zixing Zhang, Xupeng Zha, and Zhihua Jiang. 2024. LSTDial: Enhancing Dialogue Generation via Long- and Short-Term Measurement Feedback. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 5857–5871, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
LSTDial: Enhancing Dialogue Generation via Long- and Short-Term Measurement Feedback (Ye et al., NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.326.pdf
Copyright:
2024.naacl-long.326.copyright.pdf