Compressing Transformer-Based Semantic Parsing Models using Compositional Code Embeddings

Prafull Prakash, Saurabh Kumar Shashidhar, Wenlong Zhao, Subendhu Rongali, Haidar Khan, Michael Kayser


Abstract
The current state-of-the-art task-oriented semantic parsing models use BERT or RoBERTa as pretrained encoders; these models have huge memory footprints. This poses a challenge to their deployment for voice assistants such as Amazon Alexa and Google Assistant on edge devices with limited memory budgets. We propose to learn compositional code embeddings to greatly reduce the sizes of BERT-base and RoBERTa-base. We also apply the technique to DistilBERT, ALBERT-base, and ALBERT-large, three already-compressed BERT variants which attain similar state-of-the-art performance on semantic parsing with much smaller model sizes. We observe 95.15%–98.46% embedding compression rates and 20.47%–34.22% encoder compression rates while preserving >97.5% of semantic parsing performance. We provide the recipe for training and analyze the trade-off between code embedding sizes and downstream performance.
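Compositional code embeddings replace the full V×d embedding table with M small codebooks plus M discrete codes per token; each token embedding is reconstructed as the sum of one vector selected from each codebook. The NumPy sketch below illustrates only this reconstruction step and the resulting storage arithmetic; all sizes and names are illustrative placeholders rather than the paper's actual settings, and the random codes stand in for codes that would be learned end-to-end.

```python
import numpy as np

# Illustrative sizes: vocabulary V, embedding dim d, M codebooks of K vectors each.
V, d, M, K = 30000, 768, 8, 16

# Discrete codes: one index per codebook for every token (placeholders, not learned here).
codes = np.random.randint(0, K, size=(V, M))          # shape (V, M), values in [0, K)

# Codebooks: M tables of K vectors of dimension d.
codebooks = np.random.randn(M, K, d).astype(np.float32)

def embed(token_id: int) -> np.ndarray:
    """Reconstruct a token embedding as the sum of one vector per codebook."""
    return sum(codebooks[m, codes[token_id, m]] for m in range(M))

# Storage comparison: bit-packed codes plus codebooks vs. a dense float32 table.
compressed_bytes = V * M * np.ceil(np.log2(K)) / 8 + codebooks.nbytes
original_bytes = V * d * 4
print(f"compressed ≈ {compressed_bytes / 1e6:.1f} MB vs original ≈ {original_bytes / 1e6:.1f} MB")
```

With these placeholder sizes the codes and codebooks occupy well under a megabyte, whereas the dense table is roughly 92 MB, which is the intuition behind the embedding compression rates reported in the abstract.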
Anthology ID:
2020.findings-emnlp.423
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4711–4717
URL:
https://aclanthology.org/2020.findings-emnlp.423
DOI:
10.18653/v1/2020.findings-emnlp.423
Cite (ACL):
Prafull Prakash, Saurabh Kumar Shashidhar, Wenlong Zhao, Subendhu Rongali, Haidar Khan, and Michael Kayser. 2020. Compressing Transformer-Based Semantic Parsing Models using Compositional Code Embeddings. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4711–4717, Online. Association for Computational Linguistics.
Cite (Informal):
Compressing Transformer-Based Semantic Parsing Models using Compositional Code Embeddings (Prakash et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.423.pdf
Optional supplementary material:
2020.findings-emnlp.423.OptionalSupplementaryMaterial.zip
Data
SNIPS