Improving Grammar-based Sequence-to-Sequence Modeling with Decomposition and Constraints

Chao Lou, Kewei Tu


Abstract
Neural QCFG is a grammar-based sequence-to-sequence model with strong inductive biases on hierarchical structures. It excels in interpretability and generalization but suffers from expensive inference. In this paper, we study two low-rank variants of Neural QCFG for faster inference with different trade-offs between efficiency and expressiveness. Furthermore, utilizing the symbolic interface provided by the grammar, we introduce two soft constraints over tree hierarchy and source coverage. We experiment with various datasets and find that our models outperform vanilla Neural QCFG in most settings.
Anthology ID:
2023.acl-short.163
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1918–1929
URL:
https://aclanthology.org/2023.acl-short.163
DOI:
10.18653/v1/2023.acl-short.163
Bibkey:
Cite (ACL):
Chao Lou and Kewei Tu. 2023. Improving Grammar-based Sequence-to-Sequence Modeling with Decomposition and Constraints. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1918–1929, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Improving Grammar-based Sequence-to-Sequence Modeling with Decomposition and Constraints (Lou & Tu, ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.163.pdf
Video:
https://aclanthology.org/2023.acl-short.163.mp4