Breaking through Deterministic Barriers: Randomized Pruning Mask Generation and Selection

Jianwei Li, Weizhi Gao, Qi Lei, Dongkuan Xu


Abstract
It is widely acknowledged that large, sparse models achieve higher accuracy than small, dense models under the same model-size constraints. This motivates us to train a large model and then remove its redundant neurons or weights by pruning. Most existing works prune networks deterministically, so their performance depends solely on a single pruning criterion and thus lacks variety. Instead, in this paper, we propose a model pruning strategy that first generates several pruning masks in a designed random way. Subsequently, along with an effective mask-selection rule, the optimal mask is chosen from the pool of mask candidates. To further enhance efficiency, we introduce an early mask evaluation strategy, mitigating the overhead associated with training multiple masks. Our extensive experiments demonstrate that this approach achieves state-of-the-art performance across eight datasets from GLUE, particularly excelling at high levels of sparsity.
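The two-stage idea in the abstract (randomized mask generation, then mask selection) can be illustrated with a minimal sketch. This is not the authors' actual method: the magnitude-proportional sampling, the candidate count, and the proxy loss `loss_fn` are all illustrative assumptions, and the early-evaluation step is reduced here to a single cheap proxy score per mask.

```python
import numpy as np

def sample_mask(weights, sparsity, rng):
    """Sample one random pruning mask.

    Illustrative choice: a weight's keep-probability is proportional to
    its magnitude, so each draw is a different stochastic perturbation
    of magnitude pruning rather than a single deterministic cut.
    """
    scores = np.abs(weights).ravel()
    probs = scores / scores.sum()
    n_keep = int(round(weights.size * (1 - sparsity)))
    keep = rng.choice(weights.size, size=n_keep, replace=False, p=probs)
    mask = np.zeros(weights.size, dtype=bool)
    mask[keep] = True
    return mask.reshape(weights.shape)

def select_mask(weights, loss_fn, sparsity, n_candidates=8, seed=0):
    """Generate several random masks and return the one with the
    lowest proxy loss (a stand-in for early mask evaluation)."""
    rng = np.random.default_rng(seed)
    candidates = [sample_mask(weights, sparsity, rng)
                  for _ in range(n_candidates)]
    losses = [loss_fn(weights * m) for m in candidates]
    return candidates[int(np.argmin(losses))]
```

As a toy usage, one could score each candidate by the squared error the pruned weights incur against the dense weights, `loss_fn = lambda pruned: float(((pruned - w) ** 2).sum())`; in practice the paper evaluates candidates on actual task performance.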
Anthology ID:
2023.findings-emnlp.763
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11407–11423
URL:
https://aclanthology.org/2023.findings-emnlp.763
DOI:
10.18653/v1/2023.findings-emnlp.763
Cite (ACL):
Jianwei Li, Weizhi Gao, Qi Lei, and Dongkuan Xu. 2023. Breaking through Deterministic Barriers: Randomized Pruning Mask Generation and Selection. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11407–11423, Singapore. Association for Computational Linguistics.
Cite (Informal):
Breaking through Deterministic Barriers: Randomized Pruning Mask Generation and Selection (Li et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.763.pdf