NeuronBlocks: Building Your NLP DNN Models Like Playing Lego

Ming Gong, Linjun Shou, Wutao Lin, Zhijie Sang, Quanjia Yan, Ze Yang, Feixiang Cheng, Daxin Jiang

Abstract
Deep Neural Networks (DNN) have been widely employed in industry to address various Natural Language Processing (NLP) tasks. However, many engineers find it a big overhead when they have to choose from multiple frameworks, compare different types of models, and understand various optimization mechanisms. An NLP toolkit for DNN models with both generality and flexibility can greatly improve the productivity of engineers by saving their learning cost and guiding them to find optimal solutions to their tasks. In this paper, we introduce NeuronBlocks, a toolkit encapsulating a suite of neural network modules as building blocks to construct various DNN models with complex architecture. This toolkit empowers engineers to build, train, and test various NLP models through simple configuration of JSON files. The experiments on several NLP datasets such as GLUE, WikiQA and CoNLL-2003 demonstrate the effectiveness of NeuronBlocks.
Code: https://github.com/Microsoft/NeuronBlocks
Demo: https://youtu.be/x6cOpVSZcdo
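As context for the "building blocks via JSON configuration" idea described in the abstract, below is a minimal Python sketch that writes out a block-style model configuration. The block names ("Embedding", "BiLSTM", "Linear") and the JSON keys are illustrative assumptions, not the toolkit's actual schema; see the NeuronBlocks repository for real configuration examples.

    # Hypothetical sketch of a configuration-driven model definition.
    # The schema below is assumed for illustration and is not the exact
    # NeuronBlocks specification.
    import json

    config = {
        "task": "text_classification",            # assumed task label
        "inputs": {"train": "train.tsv", "test": "test.tsv"},
        "architecture": [                          # stack of reusable blocks
            {"layer": "Embedding", "dim": 300},
            {"layer": "BiLSTM", "hidden_dim": 128},
            {"layer": "Linear", "num_classes": 2},
        ],
        "training_params": {"optimizer": "Adam", "lr": 0.001, "epochs": 10},
    }

    # Write the model definition to a JSON file; driver scripts such as the
    # repository's train.py consume configuration files of this kind.
    with open("model_conf.json", "w") as f:
        json.dump(config, f, indent=2)

In the actual toolkit, a configuration file like this is passed to the provided training and testing scripts, so a new model variant only requires editing the JSON rather than writing framework code.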
Anthology ID:
D19-3028
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP): System Demonstrations
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Sebastian Padó, Ruihong Huang
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
163–168
URL:
https://aclanthology.org/D19-3028
DOI:
10.18653/v1/D19-3028
Cite (ACL):
Ming Gong, Linjun Shou, Wutao Lin, Zhijie Sang, Quanjia Yan, Ze Yang, Feixiang Cheng, and Daxin Jiang. 2019. NeuronBlocks: Building Your NLP DNN Models Like Playing Lego. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP): System Demonstrations, pages 163–168, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
NeuronBlocks: Building Your NLP DNN Models Like Playing Lego (Gong et al., EMNLP-IJCNLP 2019)
PDF:
https://aclanthology.org/D19-3028.pdf
Code:
Microsoft/NeuronBlocks (+ additional community code)
Data:
CoNLL 2003 | GLUE | WikiQA