Fenglong Su


2023

Temporal Extrapolation and Knowledge Transfer for Lifelong Temporal Knowledge Graph Reasoning
Zhongwu Chen | Chengjin Xu | Fenglong Su | Zhen Huang | Yong Dou
Findings of the Association for Computational Linguistics: EMNLP 2023

Real-world Temporal Knowledge Graphs (TKGs) keep growing over time, with new entities and facts emerging continually, which calls for a model that can extrapolate to future timestamps and transfer knowledge to new components. Our work is the first to address this more realistic setting, lifelong TKG reasoning, of which existing methods can handle only part of the challenges. Specifically, we formulate lifelong TKG reasoning as a temporal-path-based reinforcement learning (RL) framework. We add temporal displacement to the RL action space to extrapolate into the future and further propose temporal-rule-based reward shaping to guide training. To transfer and update knowledge, we design a new edge-aware message passing module in which the embeddings of new entities and edges are inductive. We conduct extensive experiments on three newly constructed benchmarks for lifelong TKG reasoning, and the results show that our model outperforms all well-adapted baselines.
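The abstract above outlines a temporal-path RL agent whose actions carry a temporal displacement and whose reward is shaped by temporal rules. Below is a minimal, hypothetical Python sketch of those two ideas only; the quadruple format, `candidate_actions`, `shaped_reward`, and the toy rule table are illustrative assumptions, not the paper's actual code.

```python
# Minimal, hypothetical sketch (not the authors' implementation): one
# temporal-path RL rollout where each action carries a temporal
# displacement, and a table of mined temporal rules shapes the reward.
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class Quadruple:
    s: str   # subject entity
    r: str   # relation
    o: str   # object entity
    t: int   # timestamp

def candidate_actions(graph: list[Quadruple], entity: str,
                      t_query: int) -> list[tuple[str, str, int]]:
    """Outgoing edges from `entity`, each paired with the temporal
    displacement t_edge - t_query, which lets the agent step toward
    future timestamps when extrapolating."""
    return [(q.r, q.o, q.t - t_query) for q in graph if q.s == entity]

def shaped_reward(hit: bool, relation_path: tuple[str, ...],
                  query_relation: str,
                  rule_confidence: dict[tuple[str, tuple[str, ...]], float]) -> float:
    """Terminal hit gives reward 1; otherwise fall back to the confidence
    of a temporal rule linking the query relation to the traversed path."""
    return 1.0 if hit else rule_confidence.get((query_relation, relation_path), 0.0)

# Toy usage: a two-step walk from entity "A" with a query timestamp of 2.
graph = [Quadruple("A", "visits", "B", 3), Quadruple("B", "meets", "C", 5)]
rules = {("visits_then_meets", ("visits", "meets")): 0.7}
state, path = "A", []
for _ in range(2):
    actions = candidate_actions(graph, state, t_query=2)
    if not actions:
        break
    rel, obj, delta_t = random.choice(actions)
    path.append(rel)
    state = obj
print(shaped_reward(state == "D", tuple(path), "visits_then_meets", rules))  # 0.7
```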

2022

基于知识监督的标签降噪实体对齐(Refined De-noising for Labeled Entity Alignment from Auxiliary Evidence Knowledge)
Fenglong Su (苏丰龙) | Ning Jing (景宁)
Proceedings of the 21st Chinese National Conference on Computational Linguistics

Most existing entity alignment solutions rely on clean labeled data to train the model and pay little attention to seed noise. To address the noise problem in entity alignment, this paper proposes a label de-noising framework that injects auxiliary knowledge and incidental supervision into entity alignment to correct seed errors during labeling and bootstrapping. In particular, considering the weaknesses of previous neighborhood-embedding-based methods, this paper applies a new dual relation attention matching encoder to accelerate the structural learning of knowledge graphs, while using auxiliary knowledge to compensate for the insufficiency of structural representations. Weakly supervised label de-noising is then performed through adversarial training. To counter error accumulation, the paper further uses an alignment refinement module to improve model performance. Experimental results show that the proposed framework readily handles entity alignment in noisy settings, consistently outperforming other baseline methods in alignment accuracy and noise discrimination on multiple real-world datasets.
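Since the abstract describes injecting auxiliary evidence to de-noise labeled seed alignments, here is a minimal, hypothetical sketch of what such seed filtering could look like; `denoise_seeds`, the evidence dictionary, the weighting `alpha`, and the threshold are illustrative assumptions rather than the paper's method, which additionally uses a dual relation attention encoder, adversarial training, and alignment refinement.

```python
# Minimal, hypothetical sketch (not the paper's implementation): score each
# labeled seed pair with structural similarity plus auxiliary evidence, and
# flag low-agreement seeds as probable label noise.
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))

def denoise_seeds(emb_kg1: dict[str, np.ndarray], emb_kg2: dict[str, np.ndarray],
                  seeds: list[tuple[str, str]],
                  evidence: dict[tuple[str, str], float],
                  alpha: float = 0.7, threshold: float = 0.5) -> list[tuple[str, str]]:
    """Keep a labeled seed (e1, e2) only if the combination of structural
    similarity and auxiliary evidence (e.g. name/attribute agreement)
    exceeds the threshold; the rest are treated as probable noise."""
    kept = []
    for e1, e2 in seeds:
        structural = cosine(emb_kg1[e1], emb_kg2[e2])
        auxiliary = evidence.get((e1, e2), 0.0)
        if alpha * structural + (1 - alpha) * auxiliary >= threshold:
            kept.append((e1, e2))
    return kept

# Toy usage with random 4-dimensional embeddings; the second seed is
# constructed as a likely mismatch.
rng = np.random.default_rng(0)
emb1 = {"kg1:Paris": rng.normal(size=4), "kg1:Rome": rng.normal(size=4)}
emb2 = {"kg2:Paris": emb1["kg1:Paris"] + 0.05 * rng.normal(size=4),
        "kg2:Berlin": rng.normal(size=4)}
seeds = [("kg1:Paris", "kg2:Paris"), ("kg1:Rome", "kg2:Berlin")]
evidence = {("kg1:Paris", "kg2:Paris"): 1.0}
print(denoise_seeds(emb1, emb2, seeds, evidence))
```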

2021

Time-aware Graph Neural Network for Entity Alignment between Temporal Knowledge Graphs
Chengjin Xu | Fenglong Su | Jens Lehmann
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing

Entity alignment aims to identify equivalent entity pairs between different knowledge graphs (KGs). Recently, the availability of temporal KGs (TKGs) that contain time information has created the need for reasoning over time in such graphs. Existing embedding-based entity alignment approaches disregard the time information that commonly exists in many large-scale KGs, leaving much room for improvement. In this paper, we focus on the task of aligning entity pairs between TKGs and propose a novel Time-aware Entity Alignment approach based on Graph Neural Networks (TEA-GNN). We embed entities, relations and timestamps of different KGs into a vector space and use GNNs to learn entity representations. To incorporate both relation and time information into the GNN structure of our model, we use a self-attention mechanism that assigns different weights to different nodes with orthogonal transformation matrices computed from embeddings of the relevant relations and timestamps in a neighborhood. Experimental results on multiple real-world TKG datasets show that our method significantly outperforms state-of-the-art methods due to the inclusion of time information.
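As the abstract describes neighborhood attention weighted by orthogonal transformation matrices derived from relation and timestamp embeddings, the sketch below illustrates that general idea; the Householder reflection used here to build an orthogonal matrix, along with all function names and the toy data, are assumptions and may differ from TEA-GNN's exact construction.

```python
# Minimal, hypothetical sketch (assumptions throughout): time-aware
# neighborhood attention in the spirit of TEA-GNN. Each (relation,
# timestamp) pair yields an orthogonal transform; a Householder reflection
# is one simple way to obtain such a matrix from an embedding.
import numpy as np

def householder(v: np.ndarray) -> np.ndarray:
    """Orthogonal reflection matrix I - 2 v v^T / ||v||^2."""
    v = v / (np.linalg.norm(v) + 1e-9)
    return np.eye(v.size) - 2.0 * np.outer(v, v)

def time_aware_attention(h_center: np.ndarray,
                         neighbors: list[tuple[np.ndarray, np.ndarray, np.ndarray]]
                         ) -> np.ndarray:
    """Aggregate neighbor entity embeddings, where each neighbor is a tuple
    (entity_emb, relation_emb, timestamp_emb). Attention logits come from
    the center entity and the orthogonally transformed neighbor."""
    logits, transformed = [], []
    for h_nb, h_rel, h_ts in neighbors:
        M = householder(h_rel + h_ts)      # relation- and time-specific orthogonal transform
        z = M @ h_nb
        logits.append(float(h_center @ z))
        transformed.append(z)
    weights = np.exp(logits - np.max(logits))
    weights /= weights.sum()               # softmax over the neighborhood
    return sum(w * z for w, z in zip(weights, transformed))

# Toy usage with 4-dimensional embeddings and three neighbors.
rng = np.random.default_rng(1)
center = rng.normal(size=4)
nbrs = [(rng.normal(size=4), rng.normal(size=4), rng.normal(size=4))
        for _ in range(3)]
print(time_aware_attention(center, nbrs))
```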