Modeling Compositionality with Dependency Graph for Dialogue Generation

Xiaofeng Chen, Yirong Chen, Xiaofen Xing, Xiangmin Xu, Wenjing Han, Qianfeng Tie


Abstract
Because of the compositionality of natural language, syntactic structure, which encodes the relationships between words, is a key factor in semantic understanding. However, the widely adopted Transformer struggles to learn syntactic structure effectively in dialogue generation tasks. To explicitly model the compositionality of language in the Transformer block, we restrict the information flow between words by constructing a directed dependency graph and propose Dependency Relation Attention (DRA). Experimental results demonstrate that DRA further improves the performance of state-of-the-art models for dialogue generation.
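The core idea of restricting attention along a dependency graph can be sketched as a masked scaled dot-product attention. This is a minimal illustration of the general technique, not the paper's exact DRA formulation: the edge convention, function name, and toy inputs are assumptions for demonstration.

```python
import numpy as np

def dependency_masked_attention(Q, K, V, dep_edges, seq_len):
    """Scaled dot-product attention where each token attends only to
    itself and its dependency head (a simplified sketch; the paper's
    actual DRA mechanism may differ)."""
    # Build an additive attention mask from directed edges (head -> dependent).
    mask = np.full((seq_len, seq_len), -np.inf)
    np.fill_diagonal(mask, 0.0)        # every token may attend to itself
    for head, dep in dep_edges:
        mask[dep, head] = 0.0          # a dependent may attend to its head
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k) + mask
    # Numerically stable softmax over the allowed positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: "she likes music", with dependency edges
# likes -> she (1, 0) and likes -> music (1, 2).
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = dependency_masked_attention(Q, K, V, dep_edges=[(1, 0), (1, 2)], seq_len=3)
```

The mask zeroes out attention weights for word pairs with no dependency relation, so composition happens only along syntactic edges.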
Anthology ID:
2022.suki-1.2
Volume:
Proceedings of the Workshop on Structured and Unstructured Knowledge Integration (SUKI)
Month:
July
Year:
2022
Address:
Seattle, USA
Editors:
Wenhu Chen, Xinyun Chen, Zhiyu Chen, Ziyu Yao, Michihiro Yasunaga, Tao Yu, Rui Zhang
Venue:
SUKI
Publisher:
Association for Computational Linguistics
Pages:
9–16
URL:
https://aclanthology.org/2022.suki-1.2
DOI:
10.18653/v1/2022.suki-1.2
Cite (ACL):
Xiaofeng Chen, Yirong Chen, Xiaofen Xing, Xiangmin Xu, Wenjing Han, and Qianfeng Tie. 2022. Modeling Compositionality with Dependency Graph for Dialogue Generation. In Proceedings of the Workshop on Structured and Unstructured Knowledge Integration (SUKI), pages 9–16, Seattle, USA. Association for Computational Linguistics.
Cite (Informal):
Modeling Compositionality with Dependency Graph for Dialogue Generation (Chen et al., SUKI 2022)
PDF:
https://aclanthology.org/2022.suki-1.2.pdf