%0 Conference Proceedings
%T SJ_AJ@DravidianLangTech-EACL2021: Task-Adaptive Pre-Training of Multilingual BERT models for Offensive Language Identification
%A Jayanthi, Sai Muralidhar
%A Gupta, Akshat
%Y Chakravarthi, Bharathi Raja
%Y Priyadharshini, Ruba
%Y Kumar M, Anand
%Y Krishnamurthy, Parameswari
%Y Sherly, Elizabeth
%S Proceedings of the First Workshop on Speech and Language Technologies for Dravidian Languages
%D 2021
%8 April
%I Association for Computational Linguistics
%C Kyiv
%F jayanthi-gupta-2021-sj
%X In this paper, we present our submission for the EACL 2021 Shared Task on Offensive Language Identification in Dravidian languages. Our final system is an ensemble of mBERT and XLM-RoBERTa models which leverage task-adaptive pre-training of multilingual BERT models with a masked language modeling objective. Our system was ranked 1st for Kannada, 2nd for Malayalam and 3rd for Tamil.
%U https://aclanthology.org/2021.dravidianlangtech-1.44
%P 307-312