Improving BERT Model Using Contrastive Learning for Biomedical Relation Extraction

Peng Su, Yifan Peng, K. Vijay-Shanker


Abstract
Contrastive learning has been used to learn high-quality image representations in computer vision. However, it is not widely utilized in natural language processing due to the lack of a general data augmentation method for text. In this work, we explore employing contrastive learning to improve the text representation from the BERT model for relation extraction. The key component of our framework is a unique contrastive pre-training step tailored to relation extraction tasks, which seamlessly integrates linguistic knowledge into the data augmentation. Furthermore, we investigate how large-scale data constructed from external knowledge bases can enhance the generality of contrastive pre-training of BERT. Experimental results on three relation extraction benchmark datasets demonstrate that our method can improve the BERT model representation and achieve state-of-the-art performance. In addition, we explore the interpretability of the models by showing that BERT with contrastive pre-training relies more on rationales for prediction. Our code and data are publicly available at: https://github.com/AnonymousForNow.
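The contrastive pre-training step described above pulls representations of augmented views of the same sentence together while pushing apart representations of different sentences. The paper's specific augmentation uses linguistic knowledge, which is not reproduced here; the sketch below only illustrates a standard contrastive (NT-Xent-style) objective over paired embeddings, with all function names and the toy vectors being hypothetical choices for illustration.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nt_xent_loss(embeddings, temperature=0.1):
    """NT-Xent contrastive loss.

    embeddings[2i] and embeddings[2i+1] are assumed to be two augmented
    views (a positive pair) of the same sentence; all other items in the
    batch serve as negatives.
    """
    n = len(embeddings)
    total = 0.0
    for i in range(n):
        j = i + 1 if i % 2 == 0 else i - 1  # index of i's positive partner
        # Denominator: similarities to every other example in the batch.
        denom = sum(
            math.exp(cosine(embeddings[i], embeddings[k]) / temperature)
            for k in range(n) if k != i
        )
        pos = math.exp(cosine(embeddings[i], embeddings[j]) / temperature)
        total += -math.log(pos / denom)
    return total / n
```

Minimizing this loss encourages the encoder (e.g., BERT's sentence representation) to map augmented views of the same relation-bearing sentence close together, which is the general mechanism the pre-training step relies on.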
Anthology ID:
2021.bionlp-1.1
Volume:
Proceedings of the 20th Workshop on Biomedical Language Processing
Month:
June
Year:
2021
Address:
Online
Venues:
BioNLP | NAACL
SIG:
SIGBIOMED
Publisher:
Association for Computational Linguistics
Pages:
1–10
URL:
https://aclanthology.org/2021.bionlp-1.1
DOI:
10.18653/v1/2021.bionlp-1.1
Cite (ACL):
Peng Su, Yifan Peng, and K. Vijay-Shanker. 2021. Improving BERT Model Using Contrastive Learning for Biomedical Relation Extraction. In Proceedings of the 20th Workshop on Biomedical Language Processing, pages 1–10, Online. Association for Computational Linguistics.
Cite (Informal):
Improving BERT Model Using Contrastive Learning for Biomedical Relation Extraction (Su et al., BioNLP 2021)
PDF:
https://aclanthology.org/2021.bionlp-1.1.pdf
Code
udel-biotm-lab/BERT-CLRE