%0 Conference Proceedings
%T Fine-grained domain classification using Transformers
%A Gahoi, Akshat
%A Chhajer, Akshat
%A Mishra Sharma, Dipti
%Y Sharma, Dipti Misra
%Y Ekbal, Asif
%Y Arora, Karunesh
%Y Naskar, Sudip Kumar
%Y Ganguly, Dipankar
%Y L, Sobha
%Y Mamidi, Radhika
%Y Arora, Sunita
%Y Mishra, Pruthwik
%Y Mujadia, Vandan
%S Proceedings of the 17th International Conference on Natural Language Processing (ICON): TechDOfication 2020 Shared Task
%D 2020
%8 December
%I NLP Association of India (NLPAI)
%C Patna, India
%F gahoi-etal-2020-fine
%X The introduction of transformers in 2017, followed by BERT in 2018, brought about a revolution in the field of natural language processing. Such models are pretrained on vast amounts of data and are easily adapted to a wide variety of tasks through transfer learning. Continued work on transformer-based architectures has led to a variety of new models with state-of-the-art results. RoBERTa (CITATION) is one such model; it introduces a series of changes to the BERT architecture and is capable of producing better-quality embeddings at the expense of some functionality. In this paper, we attempt to solve the well-known text classification task of fine-grained domain classification using BERT and RoBERTa, and perform a comparative analysis of the two. We also evaluate the impact of data preprocessing, especially in the context of fine-grained domain classification. Our results outperformed all other models at the ICON TechDOfication 2020 (subtask-2a) fine-grained domain classification task and ranked first, demonstrating the effectiveness of our approach.
%U https://aclanthology.org/2020.icon-techdofication.7
%P 31-34