Incorporating Compositionality and Morphology into End-to-End Models

Emily Pitler


Abstract
Many neural end-to-end systems today do not rely on syntactic parse trees, as much of the information that parse trees provide is encoded in the parameters of pretrained models. Lessons learned from parsing technologies and from taking a multilingual perspective, however, are still relevant even for end-to-end models. This talk will describe work that relies on compositionality in semantic parsing and in reading comprehension requiring numerical reasoning. We'll then describe a new dataset that requires advances in multilingual modeling, along with approaches that model morphology better than off-the-shelf subword models and make some progress on these challenges.
Anthology ID: 2021.iwpt-1.14
Volume: Proceedings of the 17th International Conference on Parsing Technologies and the IWPT 2021 Shared Task on Parsing into Enhanced Universal Dependencies (IWPT 2021)
Month: August
Year: 2021
Address: Online
Editors: Stephan Oepen, Kenji Sagae, Reut Tsarfaty, Gosse Bouma, Djamé Seddah, Daniel Zeman
Venue: IWPT
SIG: SIGPARSE
Publisher: Association for Computational Linguistics
Pages: 145
URL: https://aclanthology.org/2021.iwpt-1.14
DOI: 10.18653/v1/2021.iwpt-1.14
Cite (ACL): Emily Pitler. 2021. Incorporating Compositionality and Morphology into End-to-End Models. In Proceedings of the 17th International Conference on Parsing Technologies and the IWPT 2021 Shared Task on Parsing into Enhanced Universal Dependencies (IWPT 2021), page 145, Online. Association for Computational Linguistics.
Cite (Informal): Incorporating Compositionality and Morphology into End-to-End Models (Pitler, IWPT 2021)
PDF: https://aclanthology.org/2021.iwpt-1.14.pdf