ZS-BERT: Towards Zero-Shot Relation Extraction with Attribute Representation Learning

Chih-Yao Chen, Cheng-Te Li


Abstract
While relation extraction is an essential task in knowledge acquisition and representation, and newly emerging relations are common in the real world, little effort has been made to predict unseen relations that cannot be observed at the training stage. In this paper, we formulate the zero-shot relation extraction problem by incorporating the text descriptions of seen and unseen relations. We propose a novel multi-task learning model, Zero-Shot BERT (ZS-BERT), to directly predict unseen relations without hand-crafted attribute labeling or multiple pairwise classifications. Given training instances consisting of input sentences and the descriptions of their seen relations, ZS-BERT learns two functions that project sentences and relations into an embedding space by jointly minimizing the distances between them and classifying seen relations. By generating the embeddings of unseen relations and incoming sentences with these two functions, we use nearest neighbor search to predict unseen relations. Experiments conducted on two well-known datasets show that ZS-BERT outperforms existing methods by at least 13.54% in F1 score.
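The prediction step described in the abstract — embedding a new sentence and searching for the nearest unseen-relation description embedding — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the embeddings here are tiny hand-made vectors, whereas ZS-BERT produces them with learned BERT-based projection functions, and the function and variable names are hypothetical.

```python
import numpy as np

def nearest_relation(sentence_emb, relation_embs, relation_names):
    """Return the relation whose description embedding is closest
    (by cosine similarity) to the sentence embedding."""
    s = sentence_emb / np.linalg.norm(sentence_emb)
    R = relation_embs / np.linalg.norm(relation_embs, axis=1, keepdims=True)
    sims = R @ s                      # cosine similarity to each relation
    return relation_names[int(np.argmax(sims))]

# Toy example with hypothetical 4-dim embeddings of three unseen relations.
relations = ["founded_by", "located_in", "spouse"]
rel_embs = np.array([
    [0.9, 0.1, 0.0, 0.1],
    [0.1, 0.9, 0.1, 0.0],
    [0.0, 0.1, 0.9, 0.1],
])
sent_emb = np.array([0.8, 0.2, 0.1, 0.0])  # embedding of an incoming sentence
print(nearest_relation(sent_emb, rel_embs, relations))  # → founded_by
```

The key property this relies on is that the two learned projection functions place a sentence close to the description of its true relation, so a relation never seen in training can still be retrieved as long as its description can be embedded.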
Anthology ID:
2021.naacl-main.272
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
3470–3479
URL:
https://aclanthology.org/2021.naacl-main.272
DOI:
10.18653/v1/2021.naacl-main.272
Cite (ACL):
Chih-Yao Chen and Cheng-Te Li. 2021. ZS-BERT: Towards Zero-Shot Relation Extraction with Attribute Representation Learning. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 3470–3479, Online. Association for Computational Linguistics.
Cite (Informal):
ZS-BERT: Towards Zero-Shot Relation Extraction with Attribute Representation Learning (Chen & Li, NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.272.pdf
Code
 dinobby/ZS-BERT
Data
Wiki-ZSL
FewRel