Graph Hawkes Transformer for Extrapolated Reasoning on Temporal Knowledge Graphs
Haohai Sun | Shangyi Geng | Jialun Zhong | Han Hu | Kun He
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Temporal Knowledge Graph (TKG) reasoning has attracted increasing attention due to its enormous potential value, and the critical issue is how to model complex temporal structure information effectively. Recent studies encode graph snapshots into a hidden vector space and then perform heuristic deductions, which works well for entity prediction. However, these approaches cannot predict when an event will occur and have two limitations: 1) many facts unrelated to the query can confuse the model; 2) information is forgotten over long-term evolutionary processes. To this end, we propose the Graph Hawkes Transformer (GHT) for both TKG entity prediction and time prediction at future timestamps. GHT uses two Transformer variants to capture instantaneous structural information and temporal evolution information, respectively, along with a new relational continuous-time encoding function that facilitates feature evolution via the Hawkes process. Extensive experiments on four public datasets demonstrate its superior performance, especially on long-term evolutionary tasks.
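The abstract builds on the Hawkes process, a self-exciting point process in which each past event temporarily raises the rate of future events. As a generic illustration of that idea only (not the GHT model itself; the function name and parameter values below are illustrative), the conditional intensity of a univariate Hawkes process with exponential decay can be sketched as:

```python
import math

def hawkes_intensity(t, event_times, mu=0.1, alpha=0.5, beta=1.0):
    """Conditional intensity of a univariate Hawkes process:
    lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i)).
    mu is the base rate; each past event adds an excitation that decays over time."""
    return mu + sum(
        alpha * math.exp(-beta * (t - ti))
        for ti in event_times
        if ti < t
    )

# Two past events at t=1 and t=2: the intensity shortly after t=2 is
# elevated, then decays back toward the base rate mu as time passes.
past = [1.0, 2.0]
near = hawkes_intensity(2.1, past)   # recent events still excite the process
far = hawkes_intensity(10.0, past)   # excitation has largely decayed
```

In TKG reasoning, this "recent events matter more" behavior is what makes the Hawkes framing attractive for predicting when, not just whether, an event occurs.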
Imbalanced Chinese Multi-label Text Classification Based on Alternating Attention
Hongliang Bi | Han Hu | Pengyuan Liu
Proceedings of the 34th Pacific Asia Conference on Language, Information and Computation
Few-Shot Relation Classification: A Survey (小样本关系分类研究综述)
Han Hu (胡晗) | Pengyuan Liu (刘鹏远)
Proceedings of the 19th Chinese National Conference on Computational Linguistics