MRC-based Medical NER with Multi-task Learning and Multi-strategies

Xiaojing Du, Jia Yuxiang, Zan Hongying


Abstract
Medical named entity recognition (NER), a fundamental task of medical information extraction, is crucial for medical knowledge graph construction, medical question answering, automatic medical record analysis, etc. Compared with named entities (NEs) in the general domain, medical named entities are usually more complex and more often nested. To cope with both flat NEs and nested NEs, we propose an MRC-based approach with multi-task learning and multi-strategies. NER can be treated as a sequence labeling (SL) task or a span boundary detection (SBD) task. We integrate an MRC-CRF model for SL and an MRC-Biaffine model for SBD into a multi-task learning architecture, and select the more efficient MRC-CRF as the final decoder. To further improve the model, we employ multiple strategies, including adaptive pre-training, adversarial training, and model stacking with cross-validation. Experiments on both the nested NER corpus CMeEE and the flat NER corpus CCKS2019 show the effectiveness of the MRC-based model with multi-task learning and multi-strategies.
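The span boundary detection (SBD) branch mentioned in the abstract uses a biaffine layer, which scores every (start, end) token pair so that nested spans can be predicted independently. As a rough illustration only (not the authors' implementation: the random encoder outputs, weights, and threshold below are placeholder assumptions), a biaffine span scorer can be sketched in NumPy:

```python
import numpy as np

def biaffine_scores(h_start, h_end, U, W, b):
    """Score every (start, end) token pair as a candidate entity span.

    h_start, h_end: (n, d) token representations for span starts/ends
    U: (d, d) bilinear weight; W: (2d,) linear weight; b: scalar bias
    Returns an (n, n) matrix whose entry [i, j] scores the span i..j.
    """
    d = h_start.shape[1]
    bilinear = h_start @ U @ h_end.T                      # (n, n) pairwise term
    linear = (h_start @ W[:d])[:, None] + (h_end @ W[d:])[None, :]
    return bilinear + linear + b

def decode_spans(scores, threshold=0.0):
    """Keep pairs with start <= end whose score exceeds the threshold.

    Because each pair is scored independently, overlapping and nested
    spans may both survive -- this is what lets a biaffine decoder
    handle nested NEs, unlike a single BIO sequence-labeling layer.
    """
    n = scores.shape[0]
    return [(i, j) for i in range(n) for j in range(i, n)
            if scores[i, j] > threshold]

# Toy usage with random stand-ins for encoder outputs and weights.
rng = np.random.default_rng(0)
n, d = 6, 8
h = rng.normal(size=(n, d))
scores = biaffine_scores(h, h, rng.normal(size=(d, d)),
                         rng.normal(size=2 * d), 0.0)
spans = decode_spans(scores, threshold=0.0)
```

In a real model the representations would come from the MRC encoder (query + passage), and the pair scores would be trained with a binary loss per span; the threshold used here is purely illustrative.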
Anthology ID:
2022.ccl-1.74
Volume:
Proceedings of the 21st Chinese National Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Nanchang, China
Editors:
Maosong Sun (孙茂松), Yang Liu (刘洋), Wanxiang Che (车万翔), Yang Feng (冯洋), Xipeng Qiu (邱锡鹏), Gaoqi Rao (饶高琦), Yubo Chen (陈玉博)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
836–847
Language:
English
URL:
https://aclanthology.org/2022.ccl-1.74
Cite (ACL):
Xiaojing Du, Jia Yuxiang, and Zan Hongying. 2022. MRC-based Medical NER with Multi-task Learning and Multi-strategies. In Proceedings of the 21st Chinese National Conference on Computational Linguistics, pages 836–847, Nanchang, China. Chinese Information Processing Society of China.
Cite (Informal):
MRC-based Medical NER with Multi-task Learning and Multi-strategies (Du et al., CCL 2022)
PDF:
https://aclanthology.org/2022.ccl-1.74.pdf
Data
CMeEE