Envisioning Future from the Past: Hierarchical Duality Learning for Multi-Turn Dialogue Generation

Ang Lv, Jinpeng Li, Shufang Xie, Rui Yan


Abstract
In this paper, we define a widely neglected property of dialogue text, duality, a hierarchical property reflected in human behaviours in daily conversations: based on the logic of a conversation (or a sentence), people can infer follow-up utterances (or tokens) from the previous text, and vice versa. We propose hierarchical duality learning for dialogue (HDLD) to simulate this human cognitive ability and generate high-quality responses that connect both previous and follow-up dialogue. HDLD exploits dualities at both the token and utterance levels and maximizes the mutual information between past and future utterances. Thus, even though future text is invisible during inference, HDLD can implicitly estimate future information from the dialogue history and generate responses that are both coherent and informative. In contrast to previous approaches that use future text only as auxiliary information to encode during training, HDLD leverages duality to enable interaction between the dialogue history and the future. This makes better use of the dialogue data and leads to improvements in both automatic and human evaluation.
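The abstract describes a duality objective that ties together a forward direction (past to future) and a backward direction (future to past). Below is a minimal, hypothetical sketch of such a symmetric dual loss in PyTorch; the class and argument names are illustrative assumptions, not the authors' released implementation or their exact hierarchical formulation.

```python
import torch.nn as nn

# Hypothetical sketch of a duality-style training objective:
# a forward model scores p(future | past) and a backward model scores
# p(past | future). Training both directions jointly is one simple way
# to encourage the generator to carry information about the (unseen)
# future when only the dialogue history is available at inference time.
class DualityLoss(nn.Module):
    def __init__(self, forward_model: nn.Module, backward_model: nn.Module):
        super().__init__()
        self.forward_model = forward_model    # models p(future | past)
        self.backward_model = backward_model  # models p(past | future)

    def forward(self, past_ids, future_ids):
        # Each model is assumed to return the negative log-likelihood
        # of its target sequence given its source sequence.
        nll_forward = self.forward_model(src=past_ids, tgt=future_ids)
        nll_backward = self.backward_model(src=future_ids, tgt=past_ids)
        # Summing the two directional losses couples the conditionals,
        # which relates to maximizing mutual information between past
        # and future text under this simplified view.
        return nll_forward + nll_backward
```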
Anthology ID:
2023.acl-long.407
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7382–7394
URL:
https://aclanthology.org/2023.acl-long.407
DOI:
10.18653/v1/2023.acl-long.407
Cite (ACL):
Ang Lv, Jinpeng Li, Shufang Xie, and Rui Yan. 2023. Envisioning Future from the Past: Hierarchical Duality Learning for Multi-Turn Dialogue Generation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7382–7394, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Envisioning Future from the Past: Hierarchical Duality Learning for Multi-Turn Dialogue Generation (Lv et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.407.pdf
Video:
https://aclanthology.org/2023.acl-long.407.mp4