Improving Biomedical Pretrained Language Models with Knowledge

Zheng Yuan, Yijia Liu, Chuanqi Tan, Songfang Huang, Fei Huang


Abstract
Pretrained language models have shown success in many natural language processing tasks. Many works have explored incorporating knowledge into language models. In the biomedical domain, experts have spent decades building large-scale knowledge bases. For example, the UMLS contains millions of entities with their synonyms and defines hundreds of relations among entities. Leveraging this knowledge can benefit a variety of downstream tasks such as named entity recognition and relation extraction. To this end, we propose KeBioLM, a biomedical pretrained language model that explicitly leverages knowledge from the UMLS knowledge bases. Specifically, we extract entities from PubMed abstracts and link them to UMLS. We then train a knowledge-aware language model that first applies a text-only encoding layer to learn entity representations and then applies a text-entity fusion encoding layer to aggregate them. In addition, we add two training objectives: entity detection and entity linking. Experiments on the named entity recognition and relation extraction tasks from the BLURB benchmark demonstrate the effectiveness of our approach. Further analysis on a collected probing dataset shows that our model better captures medical knowledge.
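
The abstract describes a two-stage encoding: a text-only encoder, span pooling of linked mentions, and a text-entity fusion encoder trained with entity detection and entity linking objectives. The following is a minimal, hypothetical PyTorch sketch of that idea, not the authors' implementation: layer counts, embedding sizes, the B/I/O detection head, the dot-product linking score, and the simplification to one mention per sequence are all illustrative assumptions.

# Hypothetical sketch of the knowledge-aware encoding described in the abstract.
import torch
import torch.nn as nn

class KnowledgeAwareLM(nn.Module):
    def __init__(self, vocab_size=30000, n_entities=50000, hidden=768):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, hidden)
        text_layer = nn.TransformerEncoderLayer(hidden, nhead=12, batch_first=True)
        fusion_layer = nn.TransformerEncoderLayer(hidden, nhead=12, batch_first=True)
        self.text_encoder = nn.TransformerEncoder(text_layer, num_layers=8)     # text-only encoding
        self.fusion_encoder = nn.TransformerEncoder(fusion_layer, num_layers=4) # text-entity fusion encoding
        self.entity_emb = nn.Embedding(n_entities, hidden)   # UMLS concept table (size is a placeholder)
        self.detect_head = nn.Linear(hidden, 3)               # entity detection as B/I/O tagging
        self.mlm_head = nn.Linear(hidden, vocab_size)         # masked language modeling

    def forward(self, token_ids, mention_mask):
        # mention_mask: (B, T) float mask marking the tokens of one linked mention.
        # 1) Text-only encoding of the tokens.
        h = self.text_encoder(self.token_emb(token_ids))                         # (B, T, H)
        # 2) Pool the mention tokens into a single entity representation (mean pooling).
        denom = mention_mask.sum(dim=1, keepdim=True).clamp(min=1)               # (B, 1)
        mention_repr = (h * mention_mask.unsqueeze(-1)).sum(dim=1) / denom       # (B, H)
        # 3) Entity linking objective: score the mention against all entity embeddings.
        linking_logits = mention_repr @ self.entity_emb.weight.T                 # (B, E)
        # 4) Aggregate the top-scoring entity embedding back into the mention tokens.
        ent = self.entity_emb(linking_logits.argmax(dim=-1))                     # (B, H)
        fused = h + mention_mask.unsqueeze(-1) * ent.unsqueeze(1)                # (B, T, H)
        h2 = self.fusion_encoder(fused)
        # 5) Heads for the training objectives named in the abstract.
        return self.mlm_head(h2), self.detect_head(h2), linking_logits

The three outputs would feed a masked-language-modeling loss, an entity-detection tagging loss, and an entity-linking loss, matching the objectives listed above.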
Anthology ID:
2021.bionlp-1.20
Volume:
Proceedings of the 20th Workshop on Biomedical Language Processing
Month:
June
Year:
2021
Address:
Online
Editors:
Dina Demner-Fushman, Kevin Bretonnel Cohen, Sophia Ananiadou, Junichi Tsujii
Venue:
BioNLP
SIG:
SIGBIOMED
Publisher:
Association for Computational Linguistics
Pages:
180–190
URL:
https://aclanthology.org/2021.bionlp-1.20
DOI:
10.18653/v1/2021.bionlp-1.20
Cite (ACL):
Zheng Yuan, Yijia Liu, Chuanqi Tan, Songfang Huang, and Fei Huang. 2021. Improving Biomedical Pretrained Language Models with Knowledge. In Proceedings of the 20th Workshop on Biomedical Language Processing, pages 180–190, Online. Association for Computational Linguistics.
Cite (Informal):
Improving Biomedical Pretrained Language Models with Knowledge (Yuan et al., BioNLP 2021)
PDF:
https://aclanthology.org/2021.bionlp-1.20.pdf
Code
 GanjinZero/KeBioLM
Data
BC2GM, BC5CDR, BLUE, ChemProt, DDI, GAD, JNLPBA, NCBI Disease