Generalizing over Long Tail Concepts for Medical Term Normalization

Beatrice Portelli, Simone Scaboro, Enrico Santus, Hooman Sedghamiz, Emmanuele Chersoni, Giuseppe Serra
Abstract
Medical term normalization consists of mapping a piece of text to a large number of output classes. Given the small size of the annotated datasets and the extremely long tail distribution of the concepts, it is of utmost importance to develop models that are capable of generalizing to scarce or unseen concepts. An important attribute of most target ontologies is their hierarchical structure. In this paper we introduce a simple and effective learning strategy that leverages such information to enhance the generalizability of both discriminative and generative models. The evaluation shows that the proposed strategy produces state-of-the-art performance on seen concepts and consistent improvements on unseen ones, also allowing for efficient zero-shot knowledge transfer across text typologies and datasets.
Anthology ID:
2022.emnlp-main.588
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8580–8591
URL:
https://aclanthology.org/2022.emnlp-main.588
DOI:
10.18653/v1/2022.emnlp-main.588
Bibkey:
Cite (ACL):
Beatrice Portelli, Simone Scaboro, Enrico Santus, Hooman Sedghamiz, Emmanuele Chersoni, and Giuseppe Serra. 2022. Generalizing over Long Tail Concepts for Medical Term Normalization. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 8580–8591, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Generalizing over Long Tail Concepts for Medical Term Normalization (Portelli et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.588.pdf