%0 Conference Proceedings
%T MedAI at SemEval-2021 Task 10: Negation-aware Pre-training for Source-free Negation Detection Domain Adaptation
%A Sun, Jinquan
%A Zhang, Qi
%A Wang, Yu
%A Zhang, Lei
%Y Palmer, Alexis
%Y Schneider, Nathan
%Y Schluter, Natalie
%Y Emerson, Guy
%Y Herbelot, Aurelie
%Y Zhu, Xiaodan
%S Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval-2021)
%D 2021
%8 August
%I Association for Computational Linguistics
%C Online
%F sun-etal-2021-medai
%X Due to increasing concerns about data privacy, source-free unsupervised domain adaptation has attracted growing research attention; only a trained source model is assumed to be available, while the labeled source data remain private. To obtain promising adaptation results, we need effective ways to transfer knowledge learned in the source domain while leveraging useful domain-specific information from the target domain. This paper describes our winning contribution to SemEval 2021 Task 10: Source-Free Domain Adaptation for Semantic Processing. Our key idea is to leverage the model trained on source-domain data to generate pseudo labels for target-domain samples. In addition, we propose Negation-aware Pre-training (NAP) to incorporate negation knowledge into the model. Our method wins 1st place with an F1-score of 0.822 on the official blind test set of the Negation Detection track.
%R 10.18653/v1/2021.semeval-1.183
%U https://aclanthology.org/2021.semeval-1.183
%U https://doi.org/10.18653/v1/2021.semeval-1.183
%P 1283-1288