Minimally-Supervised Relation Induction from Pre-trained Language Model

Lu Sun, Yongliang Shen, Weiming Lu


Abstract
Relation induction is a highly practical task in Natural Language Processing (NLP). In practical applications, one wants to induce additional entity pairs that share the same relation from only a few seed entity pairs. Thus, instead of the laborious supervised setting, in this paper we focus on the minimally-supervised setting, where only a couple of seed entity pairs per relation are provided. Although conventional relation induction methods have achieved some success, their performance depends heavily on the quality of word embeddings. Pre-trained language models such as BERT have transformed NLP and have been shown to better capture relational knowledge. In this paper, we propose a novel method for relation induction with BERT under the minimally-supervised setting. Specifically, we first extract suitable templates from the corpus using BERT's mask-prediction task and use them to build pseudo-sentences as the context of entity pairs. We then use BERT attention weights to better represent the pseudo-sentences, and we further use the Integrated Gradients of entity pairs to iteratively select better templates. Finally, with these high-quality pseudo-sentences, we can train a better classifier for relation induction. Experiments on the Google Analogy Test Set (GATS), the Bigger Analogy Test Set (BATS), and DiffVec demonstrate that our proposed method achieves state-of-the-art performance.
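The sketch below illustrates the core idea described in the abstract, not the authors' released implementation: given a candidate template and a seed entity pair, build a pseudo-sentence and use BERT's masked-language-model head to score how well the template expresses the relation. The template strings, entity pair, and the `template_score` helper are illustrative placeholders.

```python
# Minimal sketch, assuming a single-token tail entity and Hugging Face Transformers.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def template_score(template: str, head: str, tail: str) -> float:
    """Probability BERT assigns to `tail` at the [MASK] position
    when `head` is slotted into the template (hypothetical helper)."""
    text = template.format(head=head, mask=tokenizer.mask_token)
    inputs = tokenizer(text, return_tensors="pt")
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = logits[0, mask_pos].softmax(dim=-1)
    tail_id = tokenizer.convert_tokens_to_ids(tail)  # assumes `tail` is one WordPiece token
    return probs[0, tail_id].item()

# Example: rank candidate templates for a (country, capital) seed pair.
templates = ["the capital of {head} is {mask} .", "{head} is located in {mask} ."]
for t in templates:
    print(t, template_score(t, head="france", tail="paris"))
```

A template that assigns high probability to the correct tail entity across the seed pairs is a plausible pseudo-sentence context; the paper additionally refines this selection with attention-based representations and Integrated Gradients.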
Anthology ID:
2022.findings-naacl.135
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1776–1786
URL:
https://aclanthology.org/2022.findings-naacl.135
DOI:
10.18653/v1/2022.findings-naacl.135
Cite (ACL):
Lu Sun, Yongliang Shen, and Weiming Lu. 2022. Minimally-Supervised Relation Induction from Pre-trained Language Model. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1776–1786, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Minimally-Supervised Relation Induction from Pre-trained Language Model (Sun et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.135.pdf