The Microsoft Toolkit of Multi-Task Deep Neural Networks for Natural Language Understanding

Xiaodong Liu, Yu Wang, Jianshu Ji, Hao Cheng, Xueyun Zhu, Emmanuel Awa, Pengcheng He, Weizhu Chen, Hoifung Poon, Guihong Cao, Jianfeng Gao


Abstract
We present MT-DNN, an open-source natural language understanding (NLU) toolkit that makes it easy for researchers and developers to train customized deep learning models. Built upon PyTorch and Transformers, MT-DNN is designed to facilitate rapid customization for a broad spectrum of NLU tasks, using a variety of objectives (classification, regression, structured prediction) and text encoders (e.g., RNNs, BERT, RoBERTa, UniLM). A unique feature of MT-DNN is its built-in support for robust and transferable learning using the adversarial multi-task learning paradigm. To enable efficient production deployment, MT-DNN supports multi-task knowledge distillation, which can substantially compress a deep neural model without significant performance drop. We demonstrate the effectiveness of MT-DNN on a wide range of NLU applications across general and biomedical domains. The software and pre-trained models will be publicly available at https://github.com/namisan/mt-dnn.
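Below is a minimal PyTorch sketch of the multi-task knowledge distillation idea described in the abstract: a shared student encoder with task-specific heads is trained on a blend of hard labels and soft targets from per-task teacher models. This is an illustrative assumption, not the MT-DNN API; names such as MultiTaskStudent, teachers, and mixed_task_batches are hypothetical.

```python
# Sketch only (not MT-DNN's actual implementation): multi-task knowledge distillation.
# A shared encoder feeds task-specific heads; each classification task mixes a
# hard-label cross-entropy loss with a KL loss toward its teacher's soft targets.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskStudent(nn.Module):
    def __init__(self, encoder, hidden_size, num_labels_per_task):
        super().__init__()
        self.encoder = encoder  # e.g., a small Transformer or RNN encoder (assumed)
        self.heads = nn.ModuleDict({
            task: nn.Linear(hidden_size, n) for task, n in num_labels_per_task.items()
        })

    def forward(self, task, inputs):
        pooled = self.encoder(inputs)      # assumed to return [batch, hidden_size]
        return self.heads[task](pooled)    # task-specific logits

def distillation_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with KL divergence to the teacher's soft targets."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return alpha * hard + (1.0 - alpha) * soft

# Training-loop sketch: sample a task, fetch a batch, query that task's frozen teacher.
# for task, batch in mixed_task_batches:            # hypothetical data iterator
#     with torch.no_grad():
#         t_logits = teachers[task](batch.inputs)   # per-task teacher model
#     s_logits = student(task, batch.inputs)
#     loss = distillation_loss(s_logits, t_logits, batch.labels)
#     loss.backward(); optimizer.step(); optimizer.zero_grad()
```

The temperature and mixing weight alpha are standard distillation hyperparameters; the adversarial multi-task training mentioned in the abstract would add a perturbation-based regularizer on top of a loss like this.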
Anthology ID: 2020.acl-demos.16
Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations
Month: July
Year: 2020
Address: Online
Editors: Asli Celikyilmaz, Tsung-Hsien Wen
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 118–126
URL: https://aclanthology.org/2020.acl-demos.16
DOI: 10.18653/v1/2020.acl-demos.16
Cite (ACL): Xiaodong Liu, Yu Wang, Jianshu Ji, Hao Cheng, Xueyun Zhu, Emmanuel Awa, Pengcheng He, Weizhu Chen, Hoifung Poon, Guihong Cao, and Jianfeng Gao. 2020. The Microsoft Toolkit of Multi-Task Deep Neural Networks for Natural Language Understanding. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations, pages 118–126, Online. Association for Computational Linguistics.
Cite (Informal): The Microsoft Toolkit of Multi-Task Deep Neural Networks for Natural Language Understanding (Liu et al., ACL 2020)
PDF: https://aclanthology.org/2020.acl-demos.16.pdf
Video: http://slideslive.com/38928611
Code: namisan/mt-dnn (plus additional community code)
Data: ANLI, GLUE, MRPC, MultiNLI, QNLI, SNLI, SQuAD, SST