Developing a Clinical Language Model for Swedish: Continued Pretraining of Generic BERT with In-Domain Data

Anastasios Lamproudis, Aron Henriksson, Hercules Dalianis


Abstract
The use of pretrained language models, fine-tuned to perform a specific downstream task, has become widespread in NLP. Using a generic language model in specialized domains may, however, be sub-optimal due to differences in language use and vocabulary. In this paper, it is investigated whether an existing, generic language model for Swedish can be improved for the clinical domain through continued pretraining with clinical text. The generic and domain-specific language models are fine-tuned and evaluated on three representative clinical NLP tasks: (i) identifying protected health information, (ii) assigning ICD-10 diagnosis codes to discharge summaries, and (iii) sentence-level uncertainty prediction. The results show that continued pretraining on in-domain data leads to improved performance on all three downstream tasks, indicating that there is a potential added value of domain-specific language models for clinical NLP.
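The core technique described in the abstract, continued (domain-adaptive) pretraining of an existing generic BERT model on clinical text before fine-tuning, can be sketched as below. This is a minimal illustration assuming the HuggingFace transformers and datasets libraries and the publicly available Swedish KB-BERT checkpoint ("KB/bert-base-swedish-cased"); the corpus file, hyperparameters, and output path are placeholders, not the exact setup used in the paper.

```python
# Minimal sketch of continued pretraining with masked language modelling.
# Model name, corpus path, and hyperparameters are illustrative placeholders.
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)
from datasets import load_dataset

model_name = "KB/bert-base-swedish-cased"  # generic Swedish BERT (assumed checkpoint)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Clinical notes, one document per line (hypothetical file).
dataset = load_dataset("text", data_files={"train": "clinical_notes.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Dynamic masking of 15% of tokens, as in standard BERT pretraining.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="swedish-clinical-bert",
    per_device_train_batch_size=16,
    num_train_epochs=1,        # placeholder; in practice training continues much longer
    learning_rate=5e-5,
    save_steps=10_000,
)

Trainer(model=model, args=args,
        train_dataset=tokenized, data_collator=collator).train()

# The resulting checkpoint can then be fine-tuned on downstream clinical tasks,
# e.g. token classification for PHI detection or sentence/document classification
# for uncertainty prediction and ICD-10 coding.
```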
Anthology ID: 2021.ranlp-1.90
Volume: Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021)
Month: September
Year: 2021
Address: Held Online
Editors: Ruslan Mitkov, Galia Angelova
Venue: RANLP
Publisher: INCOMA Ltd.
Pages: 790–797
URL: https://aclanthology.org/2021.ranlp-1.90
Cite (ACL): Anastasios Lamproudis, Aron Henriksson, and Hercules Dalianis. 2021. Developing a Clinical Language Model for Swedish: Continued Pretraining of Generic BERT with In-Domain Data. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 790–797, Held Online. INCOMA Ltd.
Cite (Informal): Developing a Clinical Language Model for Swedish: Continued Pretraining of Generic BERT with In-Domain Data (Lamproudis et al., RANLP 2021)
PDF: https://aclanthology.org/2021.ranlp-1.90.pdf