Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling

Liyuan Liu, Xiang Ren, Jingbo Shang, Xiaotao Gu, Jian Peng, Jiawei Han


Abstract
Many efforts have been made to facilitate natural language processing tasks with pre-trained language models (LMs), bringing significant improvements to various applications. To fully leverage nearly unlimited corpora and capture linguistic information at multiple levels, large LMs are required; for a specific task, however, only part of this information is useful. Such large LMs, even at the inference stage, incur heavy computational workloads, making them too time-consuming for large-scale applications. Here we propose to compress bulky LMs while preserving the information useful for a specific task. Since different layers of the model capture different information, we develop a layer selection method for model pruning using sparsity-inducing regularization. By introducing dense connectivity, we can detach any layer without affecting the others, stretching shallow and wide LMs into deep and narrow ones. During training, LMs are learned with layer-wise dropout for better robustness. Experiments on two benchmark datasets demonstrate the effectiveness of our method.
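The pruning recipe described above (dense connectivity, sparsity-inducing layer selection, and layer-wise dropout) can be sketched compactly. The following is a minimal, hypothetical PyTorch illustration of the idea, not the authors' LD-Net implementation: every layer reads the concatenation of all earlier outputs, a learnable scalar gate scales each layer's contribution, and an L1 penalty on the gates drives unhelpful layers toward zero so they can later be detached. All class names, gate parameterizations, and hyperparameters here are illustrative.

```python
import torch
import torch.nn as nn

class DenselyConnectedLM(nn.Module):
    """Minimal sketch of dense connectivity with prunable layers.

    Hypothetical illustration of the paper's idea, not the LD-Net code:
    each layer consumes the concatenation of all preceding outputs, and a
    learnable scalar gate per layer lets an L1 penalty drive unneeded
    layers toward zero so they can be detached at inference time.
    """

    def __init__(self, input_dim=100, hidden_dim=50, num_layers=8):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.LSTM(input_dim + i * hidden_dim, hidden_dim, batch_first=True)
            for i in range(num_layers)
        )
        # One scalar gate per layer; sparsity regularization acts on these.
        self.gates = nn.Parameter(torch.ones(num_layers))

    def forward(self, x, layer_dropout=0.0):
        outputs = []
        features = x
        for i, layer in enumerate(self.layers):
            out, _ = layer(features)
            # Layer-wise dropout: occasionally zero out a whole layer during
            # training so the model stays robust when layers are pruned.
            if self.training and torch.rand(()) < layer_dropout:
                out = torch.zeros_like(out)
            outputs.append(self.gates[i] * out)
            # Dense connectivity: the next layer sees the raw input plus
            # the gated outputs of every earlier layer.
            features = torch.cat([x] + outputs, dim=-1)
        return torch.cat(outputs, dim=-1)

    def l1_penalty(self):
        # Sparsity-inducing term added to the task loss; gates near zero
        # mark layers that can be removed without affecting the rest.
        return self.gates.abs().sum()


model = DenselyConnectedLM()
tokens = torch.randn(4, 12, 100)          # (batch, seq_len, embedding_dim)
reps = model(tokens, layer_dropout=0.1)   # contextualized representations
loss = reps.pow(2).mean() + 1e-3 * model.l1_penalty()  # dummy task loss
loss.backward()
```

At inference time, layers whose gates have shrunk to near zero can simply be removed; because each layer depends only on the concatenated outputs of the layers before it, the remaining layers keep working unchanged, which is what allows a shallow and wide LM to be reshaped into a deep and narrow one.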
Anthology ID:
D18-1153
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1215–1225
URL:
https://aclanthology.org/D18-1153
DOI:
10.18653/v1/D18-1153
Cite (ACL):
Liyuan Liu, Xiang Ren, Jingbo Shang, Xiaotao Gu, Jian Peng, and Jiawei Han. 2018. Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 1215–1225, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling (Liu et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1153.pdf
Video:
https://aclanthology.org/D18-1153.mp4
Code:
LiyuanLucasLiu/LD-Net
Data:
CoNLL 2003