Xin Guo
2021
Question Generation Model Based on Iterative Message Passing and Sliding Windows Hierarchical Attention (基于迭代信息传递和滑动窗口注意力的问题生成模型研究)
Qian Chen (陈千) | Xiaoying Gao (高晓影) | Suge Wang (王素格) | Xin Guo (郭鑫)
Proceedings of the 20th Chinese National Conference on Computational Linguistics
The knowledge-graph question generation task is to generate questions related to a given knowledge graph. Current knowledge-graph question generation models mainly encode knowledge-graph subgraphs with RNN- or Transformer-based encoders, but this discards the explicit graph-structured information, and the decoder ignores the importance of local information for each node. This paper proposes an iterative message passing graph encoder that encodes the subgraph while capturing its explicit graph structure; in addition, we use a sliding-window attention mechanism to improve the RNN decoder and raise the importance of the subgraph's local information for each node. Experimental results on the WQ and PQ datasets show that the proposed model outperforms the KTG model by 2.16 and 15.44 BLEU-4 points, respectively, demonstrating its effectiveness.
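A minimal sketch of the two mechanisms the abstract names, not the authors' implementation: an iterative message-passing encoder that updates node states from neighbor messages, and a sliding-window attention that restricts the decoder to a local window of node positions. All class names, gate choices, and dimensions below are assumptions for illustration.

```python
# Sketch only: iterative message passing over a KG subgraph plus a
# sliding-window attention, with assumed names and sizes.
import torch
import torch.nn as nn


class IterativeMessagePassingEncoder(nn.Module):
    """Updates node states by repeatedly aggregating neighbor messages."""

    def __init__(self, dim: int, steps: int = 3):
        super().__init__()
        self.steps = steps
        self.msg = nn.Linear(dim, dim)   # message transform
        self.upd = nn.GRUCell(dim, dim)  # gated node-state update

    def forward(self, nodes: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # nodes: (num_nodes, dim); adj: (num_nodes, num_nodes) 0/1 adjacency
        h = nodes
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
        for _ in range(self.steps):
            m = adj @ self.msg(h) / deg  # mean of neighbor messages
            h = self.upd(m, h)           # update each node's state
        return h


def sliding_window_attention(query, keys, center, window=2):
    """Attend only to node positions within `window` of `center`."""
    scores = keys @ query                  # (num_nodes,)
    idx = torch.arange(keys.size(0))
    mask = (idx - center).abs() <= window  # local window around the node
    scores = scores.masked_fill(~mask, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return weights @ keys                  # local context vector


if __name__ == "__main__":
    torch.manual_seed(0)
    n, d = 5, 16
    enc = IterativeMessagePassingEncoder(d)
    adj = (torch.rand(n, n) > 0.5).float()
    h = enc(torch.randn(n, d), adj)
    ctx = sliding_window_attention(torch.randn(d), h, center=2)
    print(h.shape, ctx.shape)
```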
2020
Continual Learning Long Short Term Memory
Xin Guo | Yu Tian | Qinghan Xue | Panos Lampropoulos | Steven Eliuk | Kenneth Barner | Xiaolong Wang
Findings of the Association for Computational Linguistics: EMNLP 2020
Catastrophic forgetting in neural networks refers to the degradation of a deep learning model's performance on previous tasks while it learns new tasks. To address this problem, we propose a novel Continual Learning Long Short Term Memory (CL-LSTM) cell for Recurrent Neural Networks (RNNs) in this paper. CL-LSTM considers not only the state of each individual task's output gates but also the correlation of the states between tasks, so that deep learning models can incrementally learn new tasks without catastrophically forgetting previously learned tasks. Experimental results demonstrate significant improvements of CL-LSTM over state-of-the-art approaches on spoken language understanding (SLU) tasks.
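A minimal sketch of the idea the abstract describes, not the paper's implementation: an LSTM-style cell that keeps one output gate per task and mixes the current task's gate with the gates of previously learned tasks. The per-task gate factorization, the mixing weights, and all names are assumptions for illustration.

```python
# Sketch only: LSTM cell with per-task output gates and a learned
# cross-task mixing of those gates (assumed design, not the paper's code).
import torch
import torch.nn as nn


class CLLSTMCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, num_tasks: int):
        super().__init__()
        # shared input, forget, and candidate gates
        self.gates = nn.Linear(input_size + hidden_size, 3 * hidden_size)
        # one output gate per task (hypothetical factorization)
        self.out_gates = nn.ModuleList(
            nn.Linear(input_size + hidden_size, hidden_size) for _ in range(num_tasks)
        )
        # learned mixing weights between tasks' output gates
        self.mix = nn.Parameter(torch.zeros(num_tasks, num_tasks))

    def forward(self, x, state, task_id: int):
        h, c = state
        z = torch.cat([x, h], dim=-1)
        i, f, g = self.gates(z).chunk(3, dim=-1)
        i, f, g = torch.sigmoid(i), torch.sigmoid(f), torch.tanh(g)
        c = f * c + i * g
        # current task's output gate plus weighted gates of earlier tasks
        o = torch.sigmoid(self.out_gates[task_id](z))
        for t in range(task_id):
            w = torch.sigmoid(self.mix[task_id, t])
            o = o + w * torch.sigmoid(self.out_gates[t](z))
        h = o * torch.tanh(c)
        return h, (h, c)


if __name__ == "__main__":
    cell = CLLSTMCell(8, 16, num_tasks=3)
    x, h, c = torch.randn(1, 8), torch.zeros(1, 16), torch.zeros(1, 16)
    y, _ = cell(x, (h, c), task_id=1)
    print(y.shape)
```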