Transformer Semantic Parsing

Gabriela Ferraro, Hanna Suominen


Abstract
In neural semantic parsing, sentences are mapped to meaning representations using encoder-decoder frameworks. In this paper, we propose to apply the Transformer architecture, instead of recurrent neural networks, to this task. Experiments on two datasets from different domains and with different levels of difficulty show that our model achieved better results than strong baselines in certain settings and competitive results across all our experiments.
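To make the task concrete, the sketch below illustrates the kind of sentence-to-meaning-representation pairs a neural semantic parser is trained on. This is a hypothetical, rule-based stand-in for illustration only: a neural encoder-decoder (RNN- or Transformer-based, as in the paper) would *learn* this mapping from data rather than hard-code it, and the question pattern and prefix-notation logical forms are assumptions, not taken from the paper's datasets.

```python
# Toy illustration of semantic parsing as a sequence-to-sequence mapping.
# A neural parser would learn this mapping from (sentence, logical form)
# pairs; the hand-written rules below merely show the input/output format.

def parse_arithmetic(sentence: str) -> str:
    """Map a simple arithmetic question to a prefix meaning representation."""
    ops = {"plus": "add", "minus": "sub", "times": "mul"}
    tokens = sentence.lower().rstrip("?").split()
    # Assumed question pattern: "what is <a> <op> <b>"
    a, op_word, b = tokens[2], tokens[3], tokens[4]
    return f"({ops[op_word]} {a} {b})"

print(parse_arithmetic("What is 2 plus 3?"))   # → (add 2 3)
print(parse_arithmetic("What is 7 times 4?"))  # → (mul 7 4)
```

In the neural setting, both the input sentence and the output logical form are treated as token sequences, so the same encoder-decoder machinery applies regardless of the target formalism.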
Anthology ID:
2020.alta-1.16
Volume:
Proceedings of the 18th Annual Workshop of the Australasian Language Technology Association
Month:
December
Year:
2020
Address:
Virtual Workshop
Venue:
ALTA
Publisher:
Australasian Language Technology Association
Pages:
121–126
URL:
https://aclanthology.org/2020.alta-1.16
PDF:
https://aclanthology.org/2020.alta-1.16.pdf
Data
Mathematics Dataset