Deep Bidirectional Transformers for Relation Extraction without Supervision

Yannis Papanikolaou, Ian Roberts, Andrea Pierleoni


Abstract
We present a novel framework for relation extraction in cases where there is a complete lack of supervision, in the form of either gold annotations or relations from a knowledge base. Our approach leverages syntactic parsing and pre-trained word embeddings to extract a small set of precise relations, which are then used to annotate a larger corpus, in a manner identical to distant supervision. The resulting data set is employed to fine-tune a pre-trained BERT model for relation extraction. Empirical evaluation on four data sets from the biomedical domain shows that our method significantly outperforms two simple baselines for unsupervised relation extraction and, despite using no supervision at all, achieves results only slightly worse than the state of the art on three of the four data sets. Importantly, we show that it is possible to successfully fine-tune a large pre-trained language model with noisy data, in contrast to previous work that relies on gold data for fine-tuning.
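To make the final step of the pipeline concrete, below is a minimal sketch (not the authors' code) of fine-tuning a pre-trained BERT model for relation classification on a distantly, i.e. noisily, labelled corpus, as the abstract describes. The model name, the entity-marker tokens, the label set, and the toy examples are all illustrative assumptions, using the Hugging Face transformers library.

```python
# Sketch: fine-tune BERT for sentence-level relation classification on
# noisy, distantly supervised labels. All data and names are hypothetical.
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

LABELS = ["no_relation", "interacts_with"]  # assumed label set

# Noisily annotated examples: (sentence with marked entities, label id).
train_data = [
    ("<e1>aspirin</e1> inhibits <e2>COX-1</e2> activity.", 1),
    ("<e1>aspirin</e1> was stored next to <e2>ibuprofen</e2>.", 0),
]

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
# Add entity-boundary markers so the model can locate the argument pair.
tokenizer.add_tokens(["<e1>", "</e1>", "<e2>", "</e2>"])
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS)
)
model.resize_token_embeddings(len(tokenizer))  # account for new markers

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):
    for text, label in train_data:
        enc = tokenizer(text, return_tensors="pt", truncation=True)
        out = model(**enc, labels=torch.tensor([label]))
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

In practice one would batch the examples with a DataLoader and hold out a development set; the paper's contribution is that this fine-tuning works even when the training labels come from the noisy rule-based annotation stage rather than gold data.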
Anthology ID:
D19-6108
Volume:
Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Colin Cherry, Greg Durrett, George Foster, Reza Haffari, Shahram Khadivi, Nanyun Peng, Xiang Ren, Swabha Swayamdipta
Venue:
WS
Publisher:
Association for Computational Linguistics
Pages:
67–75
URL:
https://aclanthology.org/D19-6108
DOI:
10.18653/v1/D19-6108
Cite (ACL):
Yannis Papanikolaou, Ian Roberts, and Andrea Pierleoni. 2019. Deep Bidirectional Transformers for Relation Extraction without Supervision. In Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019), pages 67–75, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Deep Bidirectional Transformers for Relation Extraction without Supervision (Papanikolaou et al., 2019)
PDF:
https://aclanthology.org/D19-6108.pdf