An Exploratory Study on Model Compression for Text-to-SQL

Shuo Sun, Yuze Gao, Yuchen Zhang, Jian Su, Bin Chen, Yingzhan Lin, Shuqi Sun
Abstract
Text-to-SQL translates user queries into SQL statements that can retrieve relevant answers from relational databases. Recent approaches to Text-to-SQL rely on pre-trained language models that are computationally expensive and technically challenging to deploy in real-world applications that require real-time or on-device processing capabilities. In this paper, we perform a focused study on the feasibility of applying recent model compression techniques to sketch-based and sequence-to-sequence Text-to-SQL models. Our results reveal that sketch-based Text-to-SQL models generally have higher inference efficiency and respond better to model compression than sequence-to-sequence models, making them ideal for real-world deployments, especially in use cases with simple SQL statements.
Anthology ID:
2023.findings-acl.740
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11647–11654
URL:
https://aclanthology.org/2023.findings-acl.740
DOI:
10.18653/v1/2023.findings-acl.740
Cite (ACL):
Shuo Sun, Yuze Gao, Yuchen Zhang, Jian Su, Bin Chen, Yingzhan Lin, and Shuqi Sun. 2023. An Exploratory Study on Model Compression for Text-to-SQL. In Findings of the Association for Computational Linguistics: ACL 2023, pages 11647–11654, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
An Exploratory Study on Model Compression for Text-to-SQL (Sun et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.740.pdf