Pruning Redundant Mappings in Transformer Models via Spectral-Normalized Identity Prior

Zi Lin, Jeremiah Liu, Zi Yang, Nan Hua, Dan Roth


Abstract
Traditional (unstructured) pruning methods for a Transformer model focus on regularizing the individual weights by penalizing them toward zero. In this work, we explore spectral-normalized identity priors (SNIP), a structured pruning approach that penalizes an entire residual module in a Transformer model toward an identity mapping. Our method identifies and discards unimportant non-linear mappings in the residual connections by applying a thresholding operator on the function norm, and it is applicable to any structured module, including a single attention head, an entire attention block, or a feed-forward subnetwork. Furthermore, we introduce spectral normalization to stabilize the distribution of the post-activation values of the Transformer layers, further improving the pruning effectiveness of the proposed methodology. We conduct experiments with BERT on five GLUE benchmark tasks and demonstrate that SNIP achieves effective pruning results while maintaining comparable performance. Specifically, we improve performance over the state of the art by 0.5 to 1.0% on average at a 50% compression ratio.
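
To make the core idea of the abstract concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' released code, which is linked under "Code" below): a residual sub-block y = x + f(x) whose linear maps are spectrally normalized and whose non-linear branch f is replaced by the identity when a simple empirical proxy for its function norm falls below a threshold. The class name, the norm estimator, and the threshold value are illustrative assumptions rather than details taken from the paper.

```python
# Hypothetical sketch of identity-prior pruning with spectral normalization.
# Assumed names: PrunableResidualFFN, maybe_prune, and the L2-based norm proxy.
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm


class PrunableResidualFFN(nn.Module):
    """Feed-forward residual sub-block with an identity-prior pruning switch."""

    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        # Spectral normalization bounds the Lipschitz constant of each linear
        # map, stabilizing the distribution of post-activation values.
        self.ffn = nn.Sequential(
            spectral_norm(nn.Linear(d_model, d_hidden)),
            nn.GELU(),
            spectral_norm(nn.Linear(d_hidden, d_model)),
        )
        self.pruned = False  # once True, the block acts as a pure identity

    @torch.no_grad()
    def maybe_prune(self, sample_x: torch.Tensor, threshold: float) -> None:
        """Prune the residual branch if its (empirical) function norm is small.

        The function norm is approximated here by the average L2 norm of
        f(x) over a batch of representative inputs -- an assumption made for
        illustration, not the paper's exact estimator.
        """
        fx = self.ffn(sample_x)
        fn_norm = fx.norm(dim=-1).mean().item()
        if fn_norm < threshold:
            self.pruned = True  # the mapping collapses to y = x

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.pruned:
            return x              # identity mapping: residual branch removed
        return x + self.ffn(x)    # standard residual connection


# Usage: decide whether to drop the block based on held-out activations.
block = PrunableResidualFFN(d_model=768, d_hidden=3072)
sample = torch.randn(32, 768)        # e.g. a batch of token representations
block.maybe_prune(sample, threshold=0.1)
print("pruned:", block.pruned)
```

The same thresholding switch could, in principle, wrap an attention head or a whole attention block instead of the feed-forward branch, which is what makes the approach structured rather than weight-level pruning.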
Anthology ID:
2020.findings-emnlp.64
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
719–730
URL:
https://aclanthology.org/2020.findings-emnlp.64
DOI:
10.18653/v1/2020.findings-emnlp.64
Cite (ACL):
Zi Lin, Jeremiah Liu, Zi Yang, Nan Hua, and Dan Roth. 2020. Pruning Redundant Mappings in Transformer Models via Spectral-Normalized Identity Prior. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 719–730, Online. Association for Computational Linguistics.
Cite (Informal):
Pruning Redundant Mappings in Transformer Models via Spectral-Normalized Identity Prior (Lin et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.64.pdf
Video:
https://slideslive.com/38940112
Code:
google-research/google-research
Data:
GLUE, MRPC, MultiNLI, QNLI, SST, SST-2