FastAdaSP: Multitask-Adapted Efficient Inference for Large Speech Language Model

Yichen Lu, Jiaqi Song, Chao-Han Huck Yang, Shinji Watanabe


Abstract
In this study, we explore efficient inference for multitask Speech Language Models (SpeechLMs) via token reduction. Unlike other modalities such as vision or text, speech exhibits unique temporal dependencies, so efficient inference methods developed for those modalities do not transfer directly. Moreover, efficient SpeechLM inference on long sequences and sparse signals remains largely unexplored. We propose FastAdaSP, a weighted token merging framework designed for a variety of speech-related tasks to improve the trade-off between efficiency and performance. Experimental results on WavLLM and Qwen-Audio show that our method achieves the state-of-the-art (SOTA) efficiency-performance trade-off compared with other baseline methods. Specifically, FastAdaSP achieves 7x memory efficiency and 1.83x decoding throughput without any degradation on tasks such as Emotion Recognition (ER) and Spoken Question Answering (SQA).
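For intuition only: this page carries no code, but one step of weighted token merging, in the spirit of ToMe-style bipartite soft matching rather than the authors' exact FastAdaSP algorithm, can be sketched in a few lines of PyTorch. Everything below, including the names weighted_token_merge, sizes, and r, is a hypothetical illustration under that assumption.

import torch
import torch.nn.functional as F

def weighted_token_merge(x: torch.Tensor, sizes: torch.Tensor, r: int):
    """One weighted token merging step (hypothetical sketch, not FastAdaSP).

    x:     (T, D) speech token features from one model layer
    sizes: (T,)   how many original frames each token already represents
    r:     number of tokens to remove in this step
    """
    T = x.shape[0]
    a_idx = torch.arange(0, T, 2)  # set A: even positions
    b_idx = torch.arange(1, T, 2)  # set B: odd positions

    # Cosine similarity of every A token to every B token.
    sim = F.normalize(x[a_idx], dim=-1) @ F.normalize(x[b_idx], dim=-1).T
    best_sim, best_b = sim.max(dim=-1)  # best merge partner in B

    # The r most redundant A tokens get folded into B; the rest survive.
    order = best_sim.argsort(descending=True)
    merged_a, kept_a = order[:r], order[r:]

    # Size-weighted averaging, so earlier merges keep their influence.
    out_b = x[b_idx] * sizes[b_idx, None]
    out_sizes_b = sizes[b_idx].clone()
    for i in merged_a.tolist():
        j = best_b[i].item()
        out_b[j] += x[a_idx[i]] * sizes[a_idx[i]]
        out_sizes_b[j] += sizes[a_idx[i]]
    out_b = out_b / out_sizes_b[:, None]

    # Reassemble and restore temporal order, which matters for speech.
    keep_idx = torch.cat([a_idx[kept_a], b_idx])
    feats = torch.cat([x[a_idx[kept_a]], out_b], dim=0)
    new_sizes = torch.cat([sizes[a_idx[kept_a]], out_sizes_b], dim=0)
    perm = keep_idx.argsort()
    return feats[perm], new_sizes[perm]

if __name__ == "__main__":
    x = torch.randn(100, 512)  # e.g. 100 speech frames from an encoder layer
    sizes = torch.ones(100)    # each token initially represents one frame
    x, sizes = weighted_token_merge(x, sizes, r=30)
    print(x.shape)             # torch.Size([70, 512])

The per-token sizes are the key to the "weighted" part: once two tokens merge, their average is weighted by how many original frames each side represents, so repeated merging does not wash out short but informative speech segments.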
Anthology ID:
2024.emnlp-industry.33
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2024
Address:
Miami, Florida, US
Editors:
Franck Dernoncourt, Daniel Preoţiuc-Pietro, Anastasia Shimorina
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
440–451
URL:
https://aclanthology.org/2024.emnlp-industry.33
Cite (ACL):
Yichen Lu, Jiaqi Song, Chao-Han Huck Yang, and Shinji Watanabe. 2024. FastAdaSP: Multitask-Adapted Efficient Inference for Large Speech Language Model. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 440–451, Miami, Florida, US. Association for Computational Linguistics.
Cite (Informal):
FastAdaSP: Multitask-Adapted Efficient Inference for Large Speech Language Model (Lu et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-industry.33.pdf
Poster:
2024.emnlp-industry.33.poster.pdf
Presentation:
2024.emnlp-industry.33.presentation.pdf