WAE_RN: Integrating Wasserstein Autoencoder and Relational Network for Text Sequence

Xinxin Zhang, Xiaoming Liu, Guan Yang, Fangfang Li


Abstract
One challenge in the Natural Language Processing (NLP) area is to learn semantic representations in different contexts. Recent work on pre-trained language models has received great attention and has proven to be an effective technique. Despite the success of pre-trained language models in many NLP tasks, the learned text representation only contains the correlations among the words in the sentence itself and ignores the implicit relationship between arbitrary tokens in the sequence. To address this problem, we focus on how to make our model effectively learn word representations that contain the relational information between any tokens of text sequences. In this paper, we propose to integrate a relational network (RN) into a Wasserstein autoencoder (WAE). Specifically, the WAE and the RN are used to better keep the semantic structure and to capture the relational information, respectively. Extensive experiments demonstrate that our proposed model achieves significant improvements over traditional Seq2Seq baselines.
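The page carries no implementation details beyond the abstract, so the following is a minimal, hypothetical PyTorch sketch of the general idea: a sequence autoencoder whose latent code is regularized with an MMD-based Wasserstein penalty (in the style of WAE-MMD, Tolstikhin et al. 2018) and augmented with a relational network (Santoro et al. 2017) that aggregates relations over all token pairs. All class names, layer sizes, and the way the RN summary is fused into the latent code are assumptions for illustration, not the authors' actual model.

```python
# Hypothetical sketch of a WAE + relational-network sequence autoencoder.
# Not the paper's implementation; all architectural choices are assumed.
import torch
import torch.nn as nn


class RelationalNetwork(nn.Module):
    """Applies g(h_i, h_j) to every token pair and sums the results."""

    def __init__(self, hidden: int):
        super().__init__()
        self.g = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU())

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden) per-token encoder states
        b, t, d = h.shape
        hi = h.unsqueeze(2).expand(b, t, t, d)  # h_i broadcast over j
        hj = h.unsqueeze(1).expand(b, t, t, d)  # h_j broadcast over i
        pairs = torch.cat([hi, hj], dim=-1)     # (b, t, t, 2*hidden)
        return self.g(pairs).sum(dim=(1, 2))    # relation-aware summary


def mmd_penalty(z: torch.Tensor, prior: torch.Tensor, sigma: float = 1.0):
    """Biased RBF-kernel MMD between encoded codes and prior samples."""
    def k(a, b):
        return torch.exp(-torch.cdist(a, b).pow(2) / (2 * sigma ** 2))
    return k(z, z).mean() + k(prior, prior).mean() - 2 * k(z, prior).mean()


class WAE_RN(nn.Module):
    def __init__(self, vocab: int, emb: int = 128, hidden: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.encoder = nn.GRU(emb, hidden, batch_first=True)
        self.rn = RelationalNetwork(hidden)
        self.to_z = nn.Linear(2 * hidden, hidden)  # fuse GRU state + RN summary
        self.decoder = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, tokens: torch.Tensor):
        e = self.embed(tokens)                    # (b, t, emb)
        states, last = self.encoder(e)            # per-token states, final state
        rel = self.rn(states)                     # pairwise relational summary
        z = self.to_z(torch.cat([last[-1], rel], dim=-1))
        dec, _ = self.decoder(e, z.unsqueeze(0))  # teacher-forced reconstruction
        logits = self.out(dec)                    # (b, t, vocab)
        penalty = mmd_penalty(z, torch.randn_like(z))  # match q(z) to N(0, I)
        return logits, penalty
```

In training, the scalar penalty would be added (scaled by a coefficient) to the cross-entropy reconstruction loss computed from the logits.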
Anthology ID:
2020.ccl-1.109
Volume:
Proceedings of the 19th Chinese National Conference on Computational Linguistics
Month:
October
Year:
2020
Address:
Haikou, China
Editors:
Maosong Sun (孙茂松), Sujian Li (李素建), Yue Zhang (张岳), Yang Liu (刘洋)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
1172–1182
Language:
English
URL:
https://aclanthology.org/2020.ccl-1.109
Cite (ACL):
Xinxin Zhang, Xiaoming Liu, Guan Yang, and Fangfang Li. 2020. WAE_RN: Integrating Wasserstein Autoencoder and Relational Network for Text Sequence. In Proceedings of the 19th Chinese National Conference on Computational Linguistics, pages 1172–1182, Haikou, China. Chinese Information Processing Society of China.
Cite (Informal):
WAE_RN: Integrating Wasserstein Autoencoder and Relational Network for Text Sequence (Zhang et al., CCL 2020)
PDF:
https://aclanthology.org/2020.ccl-1.109.pdf
Data
CoNLL-2012, OntoNotes 5.0