%0 Conference Proceedings
%T T2NER: Transformers based Transfer Learning Framework for Named Entity Recognition
%A Amin, Saadullah
%A Neumann, Guenter
%Y Gkatzia, Dimitra
%Y Seddah, Djamé
%S Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations
%D 2021
%8 April
%I Association for Computational Linguistics
%C Online
%F amin-neumann-2021-t2ner
%X Recent advances in deep transformer models have achieved state-of-the-art results in several natural language processing (NLP) tasks, whereas named entity recognition (NER) has traditionally benefited from long short-term memory (LSTM) networks. In this work, we present T2NER, a Transformers-based Transfer Learning framework for Named Entity Recognition, implemented in PyTorch for the task of NER with deep transformer models. The framework is built upon the Transformers library as the core modeling engine and supports several transfer learning scenarios, from sequential transfer to domain adaptation, multi-task learning, and semi-supervised learning. It aims to bridge the gap between algorithmic advances in these areas and the state of the art in transformer models, combining them into a unified platform that is readily extensible and can be used both for transfer learning research in NER and for real-world applications. The framework is available at: https://github.com/suamin/t2ner.
%R 10.18653/v1/2021.eacl-demos.25
%U https://aclanthology.org/2021.eacl-demos.25
%U https://doi.org/10.18653/v1/2021.eacl-demos.25
%P 212-220