RevMUX: Data Multiplexing with Reversible Adapters for Efficient LLM Batch Inference

Yige Xu, Xu Guo, Zhiwei Zeng, Chunyan Miao


Abstract
Large language models (LLMs) have brought great breakthroughs to the natural language processing (NLP) community, but pose the challenge of handling concurrent customer queries due to their high throughput demands. Data multiplexing addresses this by merging multiple inputs into a single composite input, allowing more efficient inference through a shared forward pass. However, as distinguishing individual samples within a composite input is challenging, conventional methods typically require training the entire backbone, yet still suffer from performance degradation. In this paper, we introduce RevMUX, a parameter-efficient data multiplexing framework that incorporates a reversible design in the multiplexer, which can be reused by the demultiplexer to perform reverse operations and restore individual samples for classification. Extensive experiments on four datasets and three types of LLM backbones demonstrate the effectiveness of RevMUX in enhancing LLM inference efficiency while retaining satisfactory classification performance.
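The abstract's central idea is a multiplexer whose mixing operation is exactly invertible, so the demultiplexer can reuse it to restore the individual samples. The snippet below is a minimal sketch of that idea using an additive coupling layer (in the spirit of NICE/RevNet); the class and method names are hypothetical, and this is not the paper's exact RevMUX architecture.

```python
# Minimal, illustrative sketch of reversible two-sample multiplexing with an
# additive coupling layer. Hypothetical names; not the authors' exact design.
import torch
import torch.nn as nn


class ReversibleMultiplexer(nn.Module):
    """Mixes two samples' hidden states into composite states and can
    exactly invert the mixing to recover the individual states."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Lightweight adapters; in a parameter-efficient setup only these
        # would be trained while the LLM backbone stays frozen.
        self.f = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.GELU(),
                               nn.Linear(hidden_dim, hidden_dim))
        self.g = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.GELU(),
                               nn.Linear(hidden_dim, hidden_dim))

    def multiplex(self, x1: torch.Tensor, x2: torch.Tensor):
        # Additive coupling: each output depends on the other stream, so the
        # transform is invertible without inverting the adapters themselves.
        y1 = x1 + self.f(x2)
        y2 = x2 + self.g(y1)
        return y1, y2

    def demultiplex(self, y1: torch.Tensor, y2: torch.Tensor):
        # Reverse operations reuse the same adapters to restore each sample.
        x2 = y2 - self.g(y1)
        x1 = y1 - self.f(x2)
        return x1, x2


if __name__ == "__main__":
    mux = ReversibleMultiplexer(hidden_dim=768)
    a, b = torch.randn(1, 768), torch.randn(1, 768)
    y1, y2 = mux.multiplex(a, b)
    a_rec, b_rec = mux.demultiplex(y1, y2)
    print(torch.allclose(a, a_rec, atol=1e-5), torch.allclose(b, b_rec, atol=1e-5))
```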
Anthology ID:
2024.emnlp-main.1232
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
22072–22087
URL:
https://aclanthology.org/2024.emnlp-main.1232
Cite (ACL):
Yige Xu, Xu Guo, Zhiwei Zeng, and Chunyan Miao. 2024. RevMUX: Data Multiplexing with Reversible Adapters for Efficient LLM Batch Inference. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 22072–22087, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
RevMUX: Data Multiplexing with Reversible Adapters for Efficient LLM Batch Inference (Xu et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.1232.pdf