Integration of Heterogeneous Knowledge Sources for Biomedical Text Processing

Parsa Bagherzadeh, Sabine Bergler


Abstract
Recently, research into bringing outside knowledge sources into current neural NLP models has been increasing. Most approaches that leverage external knowledge sources require laborious, non-trivial designs, as well as tailoring of the system through intensive ablation of different knowledge sources, an effort that discourages users from using quality ontological resources. In this paper, we show that multiple large heterogeneous knowledge sources (KSs) can be easily integrated using a decoupled approach, allowing for an automatic ablation of irrelevant KSs while keeping the overall parameter space tractable. We experiment with BERT and pre-trained graph embeddings, and show that they interoperate well without performance degradation, even when some do not contribute to the task.
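The decoupled integration with automatic ablation described in the abstract might be sketched as below. This is a minimal illustrative assumption, not the paper's actual architecture: it fuses a contextual text embedding with several pre-trained knowledge-source embeddings through per-source gates, so that a gate near zero effectively ablates an irrelevant source. The function name, gating scheme, and dimensions are all hypothetical.

```python
import numpy as np

def integrate_knowledge_sources(text_emb, ks_embs, gates):
    """Fuse a contextual text embedding with gated knowledge-source embeddings.

    text_emb : (d_text,) vector, e.g. a BERT sentence representation.
    ks_embs  : list of (d_i,) vectors, one per knowledge source (KS),
               e.g. pre-trained graph embeddings.
    gates    : list of scalars in [0, 1]; in a trained model these would
               be learned, and a gate near 0 ablates its KS automatically.
    """
    gated = [g * e for g, e in zip(gates, ks_embs)]
    # Decoupled fusion: each KS contributes its own (scaled) sub-vector,
    # so sources of different sizes and origins can coexist.
    return np.concatenate([text_emb] + gated)

# Toy example: a 4-dim "text" vector plus two 3-dim graph embeddings;
# the second source is gated out entirely.
text = np.ones(4)
ks = [np.full(3, 2.0), np.full(3, 5.0)]
fused = integrate_knowledge_sources(text, ks, gates=[1.0, 0.0])
```

In this sketch the fused vector keeps a fixed layout regardless of which sources are active, which is one way to keep the downstream parameter space tractable while still allowing irrelevant KSs to be switched off.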
Anthology ID:
2022.louhi-1.25
Volume:
Proceedings of the 13th International Workshop on Health Text Mining and Information Analysis (LOUHI)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Alberto Lavelli, Eben Holderness, Antonio Jimeno Yepes, Anne-Lyse Minard, James Pustejovsky, Fabio Rinaldi
Venue:
Louhi
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
229–238
Language:
URL:
https://aclanthology.org/2022.louhi-1.25
DOI:
10.18653/v1/2022.louhi-1.25
Bibkey:
Cite (ACL):
Parsa Bagherzadeh and Sabine Bergler. 2022. Integration of Heterogeneous Knowledge Sources for Biomedical Text Processing. In Proceedings of the 13th International Workshop on Health Text Mining and Information Analysis (LOUHI), pages 229–238, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Integration of Heterogeneous Knowledge Sources for Biomedical Text Processing (Bagherzadeh & Bergler, Louhi 2022)
PDF:
https://aclanthology.org/2022.louhi-1.25.pdf
Video:
https://aclanthology.org/2022.louhi-1.25.mp4