%0 Conference Proceedings
%T Triplet-Trained Vector Space and Sieve-Based Search Improve Biomedical Concept Normalization
%A Xu, Dongfang
%A Bethard, Steven
%Y Demner-Fushman, Dina
%Y Cohen, Kevin Bretonnel
%Y Ananiadou, Sophia
%Y Tsujii, Junichi
%S Proceedings of the 20th Workshop on Biomedical Language Processing
%D 2021
%8 June
%I Association for Computational Linguistics
%C Online
%F xu-bethard-2021-triplet
%X Concept normalization, the task of linking textual mentions of concepts to concepts in an ontology, is critical for mining and analyzing biomedical texts. We propose a vector-space model for concept normalization, where mentions and concepts are encoded via transformer networks that are trained via a triplet objective with online hard triplet mining. The transformer networks refine existing pre-trained models, and the online triplet mining makes training efficient even with hundreds of thousands of concepts by sampling training triples within each mini-batch. We introduce a variety of strategies for searching with the trained vector-space model, including approaches that incorporate domain-specific synonyms at search time with no model retraining. Across five datasets, our models that are trained only once on their corresponding ontologies are within 3 points of state-of-the-art models that are retrained for each new domain. Our models can also be trained for each domain, achieving new state-of-the-art on multiple datasets.
%R 10.18653/v1/2021.bionlp-1.2
%U https://aclanthology.org/2021.bionlp-1.2
%U https://doi.org/10.18653/v1/2021.bionlp-1.2
%P 11-22