2025
Towards Robust Few-Shot Relation Classification: Incorporating Relation Description with Agreement
Mengting Hu, Jianfeng Wu, Ming Jiang, Yalan Xie, Zhunheng Wang, Rui Ying, Xiaoyi Liu, Ruixuan Xu, Hang Gao, Renhong Cheng
Findings of the Association for Computational Linguistics: EMNLP 2025
Few-shot relation classification aims to recognize the relation between two mentioned entities with the help of only a few support samples. However, a handful of support samples is often insufficient to cover the unlimited variety of queries. If a query cannot find references among the support samples, it is defined as none-of-the-above (NOTA). Previous works mainly focus on distinguishing N+1 categories, i.e. N known relations plus one NOTA class, to recognize relations accurately. However, robustness to varying NOTA rates, i.e. the proportion of NOTA queries, remains under-investigated. In this paper, we target this robustness and propose a simple but effective framework. Specifically, we introduce relation descriptions as external knowledge to enhance the model’s comprehension of relation semantics. Moreover, we further promote robustness with a novel agreement loss, which seeks decision consistency between the instance-level decision, based on support samples, and the relation-level decision, based on relation descriptions. Extensive experimental results demonstrate that the proposed framework outperforms strong baselines while remaining robust across various NOTA rates. The code is released on GitHub at https://github.com/Pisces-29/RoFRC.
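The abstract does not specify the exact form of the agreement loss; a minimal sketch, assuming it penalizes divergence between the two class distributions (a symmetric KL is one natural choice, used here purely for illustration):

```python
import math

def softmax(logits):
    """Convert raw logits into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def agreement_loss(instance_logits, relation_logits):
    """Hypothetical agreement loss: symmetric KL divergence between the
    instance-level decision (from support samples) and the relation-level
    decision (from relation descriptions). Zero when the two agree."""
    p = softmax(instance_logits)
    q = softmax(relation_logits)
    kl_pq = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    kl_qp = sum(qi * math.log(qi / pi) for pi, qi in zip(p, q))
    return 0.5 * (kl_pq + kl_qp)
```

Minimizing such a term pushes the two views toward the same prediction, which is the consistency the abstract describes; the actual loss in the paper may differ.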
2024
Simple but Effective Compound Geometric Operations for Temporal Knowledge Graph Completion
Rui Ying, Mengting Hu, Jianfeng Wu, Yalan Xie, Xiaoyi Liu, Zhunheng Wang, Ming Jiang, Hang Gao, Linlin Zhang, Renhong Cheng
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Temporal knowledge graph completion aims to infer missing facts in temporal knowledge graphs. Current approaches usually embed factual knowledge into a continuous vector space and apply geometric operations to learn potential patterns in temporal knowledge graphs. However, these methods adopt only a single operation, which may limit their ability to capture the complex temporal dynamics present in temporal knowledge graphs. Therefore, we propose a simple but effective method, TCompoundE, which is specially designed with two compound geometric operations: a time-specific operation and a relation-specific operation. We provide mathematical proofs to demonstrate the ability of TCompoundE to encode various relation patterns. Experimental results show that our proposed model significantly outperforms existing temporal knowledge graph embedding models. Our code is available at https://github.com/nk-ruiying/TCompoundE.
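The abstract names the two operations but not their algebraic form; a minimal sketch, assuming (hypothetically) that the time-specific operation is element-wise scaling by a timestamp embedding and the relation-specific operation is a translation, composed into a distance-based score:

```python
import math

def time_specific(head, time_emb):
    """Assumed time-specific operation: element-wise scaling by the
    timestamp embedding (one common geometric choice, not necessarily
    the paper's)."""
    return [h * t for h, t in zip(head, time_emb)]

def relation_specific(vec, rel_emb):
    """Assumed relation-specific operation: translation by the relation
    embedding."""
    return [v + r for v, r in zip(vec, rel_emb)]

def score(head, rel, tail, time_emb):
    """Plausibility of fact (head, rel, tail, time): negative Euclidean
    distance between the transformed head and the tail embedding.
    Higher (closer to 0) means more plausible."""
    pred = relation_specific(time_specific(head, time_emb), rel)
    return -math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, tail)))
```

Composing two operations lets the model separate temporal dynamics (the scaling) from relational structure (the translation), which is the motivation the abstract gives for going beyond a single operation.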
ECoK: Emotional Commonsense Knowledge Graph for Mining Emotional Gold
Zhunheng Wang, Xiaoyi Liu, Mengting Hu, Rui Ying, Ming Jiang, Jianfeng Wu, Yalan Xie, Hang Gao, Renhong Cheng
Findings of the Association for Computational Linguistics: ACL 2024
The demand for understanding and expressing emotions in natural language processing is growing rapidly. Knowledge graphs, as an important form of knowledge representation, have been widely utilized in various emotion-related tasks. However, existing knowledge graphs mainly focus on representing and reasoning over general factual knowledge, and still show significant deficiencies in understanding and reasoning about emotional knowledge. In this work, we construct a comprehensive and accurate emotional commonsense knowledge graph, ECoK. We integrate cutting-edge theories from multiple disciplines such as psychology, cognitive science, and linguistics, and combine techniques such as large language models and natural language processing. By mining a large amount of text, dialogue, and sentiment-analysis data, we construct rich emotional knowledge and build the knowledge generation model COMET-ECoK. Experimental results show that ECoK contains high-quality emotional reasoning knowledge and that our knowledge generation model surpasses GPT-4-Turbo, helping downstream tasks better understand and reason about emotions. Our data and code are available at https://github.com/ZornWang/ECoK.