%0 Conference Proceedings
%T Multi-Task Supervised Pretraining for Neural Domain Adaptation
%A Meftah, Sara
%A Semmar, Nasredine
%A Tahiri, Mohamed-Ayoub
%A Tamaazousti, Youssef
%A Essafi, Hassane
%A Sadat, Fatiha
%Y Ku, Lun-Wei
%Y Li, Cheng-Te
%S Proceedings of the Eighth International Workshop on Natural Language Processing for Social Media
%D 2020
%8 July
%I Association for Computational Linguistics
%C Online
%F meftah-etal-2020-multi
%X Two prevalent transfer learning approaches are used in recent work to improve neural network performance on domains with small amounts of annotated data: multi-task learning, which trains the task of interest jointly with related auxiliary tasks to exploit their underlying similarities, and mono-task fine-tuning, where the model's weights are initialized with the pretrained weights of a large-scale labeled source domain and then fine-tuned on labeled data of the target domain (the domain of interest). In this paper, we propose a new approach that takes advantage of both: a hierarchical model is trained across multiple tasks of a source domain and then fine-tuned on multiple tasks of the target domain. Our experiments on four tasks applied to the social media domain show that our proposed approach leads to significant improvements on all tasks compared to both approaches.
%R 10.18653/v1/2020.socialnlp-1.8
%U https://aclanthology.org/2020.socialnlp-1.8
%U https://doi.org/10.18653/v1/2020.socialnlp-1.8
%P 61-71