PruMUX: Augmenting Data Multiplexing with Model Compression

Yushan Su, Vishvak Murahari, Karthik Narasimhan, Kai Li


Abstract
As language models increase in size by the day, methods for efficient inference are critical to leveraging their capabilities for various applications. Prior work has investigated techniques like model pruning, knowledge distillation, and data multiplexing to increase model throughput without sacrificing accuracy. In this paper, we combine two such methods, structured pruning and data multiplexing, to compound the speedup gains obtained by either method. Our approach, PruMUX, obtains 7.5x to 29.5x throughput improvement over the BERT-base model at accuracy thresholds ranging from 80% to 74%. We further study various combinations of parameters (such as sparsity and multiplexing factor) in the two techniques to provide a comprehensive analysis of the tradeoff between accuracy and throughput in the resulting models. We then propose Auto-PruMUX, a meta-level model that predicts high-performing pruning and multiplexing parameters for a desired accuracy loss budget, providing a practical method to leverage the combination effectively.
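To illustrate the data-multiplexing half of the approach, the following is a minimal, hypothetical PyTorch sketch: fixed random projections combine several input instances into one shared representation, a single encoder pass processes it, and per-instance heads demultiplex the result. All class and parameter names (ToyMultiplexer, ToyDemultiplexer, mux_factor) are illustrative assumptions, not the paper's implementation, and a generic Transformer layer stands in for a structurally pruned BERT encoder.

import torch
import torch.nn as nn

class ToyMultiplexer(nn.Module):
    """Combine N instance representations into one via fixed random projections."""
    def __init__(self, hidden_dim: int, mux_factor: int):
        super().__init__()
        # One fixed (non-trainable) random projection per multiplexed instance.
        self.projections = nn.Parameter(
            torch.randn(mux_factor, hidden_dim, hidden_dim) / hidden_dim ** 0.5,
            requires_grad=False,
        )

    def forward(self, x):                         # x: (mux_factor, batch, seq, hidden)
        projected = torch.einsum("nij,nbsj->nbsi", self.projections, x)
        return projected.mean(dim=0)              # (batch, seq, hidden): one shared input

class ToyDemultiplexer(nn.Module):
    """Recover one output per multiplexed instance from the shared representation."""
    def __init__(self, hidden_dim: int, mux_factor: int):
        super().__init__()
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in range(mux_factor)]
        )

    def forward(self, h):                         # h: (batch, seq, hidden)
        return torch.stack([head(h) for head in self.heads], dim=0)

# Usage: multiplex 4 instances, run the shared (ideally pruned) encoder once, demultiplex.
mux_factor, batch, seq, hidden = 4, 8, 16, 64
encoder = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)  # stand-in for a pruned BERT
mux = ToyMultiplexer(hidden, mux_factor)
demux = ToyDemultiplexer(hidden, mux_factor)

inputs = torch.randn(mux_factor, batch, seq, hidden)  # 4 instances presented at once
shared = encoder(mux(inputs))                         # single forward pass over the combined input
outputs = demux(shared)                               # (mux_factor, batch, seq, hidden)
print(outputs.shape)

In this sketch, throughput gains come from amortizing one encoder pass over mux_factor instances; PruMUX additionally shrinks that encoder via structured pruning, which is why the two speedups compound.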
Anthology ID:
2023.findings-acl.841
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13302–13315
URL:
https://aclanthology.org/2023.findings-acl.841
DOI:
10.18653/v1/2023.findings-acl.841
Cite (ACL):
Yushan Su, Vishvak Murahari, Karthik Narasimhan, and Kai Li. 2023. PruMUX: Augmenting Data Multiplexing with Model Compression. In Findings of the Association for Computational Linguistics: ACL 2023, pages 13302–13315, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
PruMUX: Augmenting Data Multiplexing with Model Compression (Su et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.841.pdf
Video:
https://aclanthology.org/2023.findings-acl.841.mp4