Context-Aware Adapter Tuning for Few-Shot Relation Learning in Knowledge Graphs

Liu Ran, Zhongzhou Liu, Xiaoli Li, Yuan Fang


Abstract
Knowledge graphs (KGs) are instrumental in various real-world applications, yet they often suffer from incompleteness due to missing relations. To predict instances for novel relations with limited training examples, few-shot relation learning approaches have emerged, utilizing techniques such as meta-learning. However, these approaches typically assume that novel relations in meta-testing and base relations in meta-training are independently and identically distributed, which may not hold in practice. To address this limitation, we propose RelAdapter, a context-aware adapter for few-shot relation learning in KGs designed to enhance the adaptation process in meta-learning. First, RelAdapter is equipped with a lightweight adapter module that facilitates relation-specific, tunable adaptation of meta-knowledge in a parameter-efficient manner. Second, RelAdapter is enriched with contextual information about the target relation, enabling enhanced adaptation to each distinct relation. Extensive experiments on three benchmark KGs validate the superiority of RelAdapter over state-of-the-art methods.
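
To illustrate the kind of component the abstract describes, below is a minimal sketch of a context-aware bottleneck adapter in PyTorch. The bottleneck design, fusion of the relation context by concatenation, and all dimension names are illustrative assumptions for exposition, not the authors' exact architecture (see the paper and released software for the actual implementation).

```python
# Minimal sketch of a context-aware adapter layer (assumed design, not the
# paper's exact module): a parameter-efficient bottleneck that adapts a
# meta-learned relation representation using a relation-specific context vector.
import torch
import torch.nn as nn


class ContextAwareAdapter(nn.Module):
    """Lightweight bottleneck adapter conditioned on a relation context vector."""

    def __init__(self, hidden_dim: int, context_dim: int, bottleneck_dim: int = 32):
        super().__init__()
        # Down-project the concatenated [hidden; context] representation,
        # then up-project back to hidden_dim; only these small matrices are tuned.
        self.down = nn.Linear(hidden_dim + context_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.ReLU()

    def forward(self, hidden: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the meta-learned representation intact
        # while the adapter learns a relation-specific correction.
        fused = torch.cat([hidden, context], dim=-1)
        return hidden + self.up(self.act(self.down(fused)))


# Usage (shapes are illustrative): adapt meta-learned relation embeddings
# with aggregated context about the target relation.
adapter = ContextAwareAdapter(hidden_dim=100, context_dim=100)
relation_emb = torch.randn(4, 100)   # meta-learned relation representations
context_emb = torch.randn(4, 100)    # aggregated context of the target relation
adapted = adapter(relation_emb, context_emb)
```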
Anthology ID:
2024.emnlp-main.970
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
17525–17537
URL:
https://aclanthology.org/2024.emnlp-main.970
Cite (ACL):
Liu Ran, Zhongzhou Liu, Xiaoli Li, and Yuan Fang. 2024. Context-Aware Adapter Tuning for Few-Shot Relation Learning in Knowledge Graphs. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 17525–17537, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Context-Aware Adapter Tuning for Few-Shot Relation Learning in Knowledge Graphs (Ran et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.970.pdf
Software:
https://aclanthology.org/2024.emnlp-main.970.software.zip
Data:
https://aclanthology.org/2024.emnlp-main.970.data.zip