Treepiece: Faster Semantic Parsing via Tree Tokenization
Sid Wang | Akshat Shrivastava | Aleksandr Livshits
Findings of the Association for Computational Linguistics: EMNLP 2023
Autoregressive (AR) encoder-decoder neural networks have proved successful in many NLP problems, including Semantic Parsing, a task that translates natural language into machine-readable parse trees. However, the sequential prediction process of AR models can be slow. To accelerate AR decoding for semantic parsing, we introduce a new technique called TreePiece that tokenizes a parse tree into subtrees and generates one subtree per decoding step. On the TOPv2 benchmark, TreePiece decodes 4.6 times faster than standard AR, and achieves comparable speed but significantly higher accuracy than Non-Autoregressive (NAR) decoding.
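As a rough intuition for the subtree-tokenization idea, the sketch below splits a toy TOP-style parse tree into one piece per intent/slot subtree, so a decoder could emit one piece per step. The `Node` class, the `split_into_subtrees` function, and the cut-at-slot heuristic are illustrative assumptions of mine; the paper learns a subtree vocabulary rather than using a fixed rule.

```python
# Toy illustration only: cut a TOP-style parse tree into subtree units.
# This hand-written heuristic is a stand-in for TreePiece's learned
# subtree vocabulary; it only shows the shape of the idea.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Node:
    label: str                          # e.g. "IN:CREATE_ALARM" or "SL:DATE_TIME"
    children: List["Node"] = field(default_factory=list)


def split_into_subtrees(root: Node) -> List[Node]:
    """Detach every slot subtree, yielding one subtree unit per cut."""
    pieces, frontier = [], [root]
    while frontier:
        node = frontier.pop()
        detached = Node(node.label)     # copy the node, dropping slot children
        for child in node.children:
            if child.label.startswith("SL:"):
                frontier.append(child)  # slot subtrees become separate pieces
            else:
                detached.children.append(child)
        pieces.append(detached)
    return pieces


if __name__ == "__main__":
    tree = Node("IN:CREATE_ALARM", [Node("SL:DATE_TIME", [Node("8 am")])])
    for piece in split_into_subtrees(tree):
        print(piece.label, [c.label for c in piece.children])
    # -> IN:CREATE_ALARM []
    #    SL:DATE_TIME ['8 am']
```

A decoder that emits these larger units takes fewer sequential steps than one emitting the parse token by token, which is the source of the speedup the abstract reports.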