Recurrent models and lower bounds for projective syntactic decoding

Natalie Schluter


Abstract
The current state-of-the-art in neural graph-based parsing uses only approximate decoding at the training phase. In this paper we aim to understand this result better. We show how recurrent models can carry out projective maximum spanning tree decoding. This result holds for the current state-of-the-art models for both shift-reduce and graph-based parsers, projective or not. We also provide the first proof of lower bounds for projective maximum spanning tree decoding.
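The projective maximum spanning tree decoding that the abstract refers to is standardly computed with Eisner's O(n³) dynamic program over complete and incomplete spans. The sketch below is a minimal score-only version of that standard algorithm, not the paper's recurrent construction; the arc-score matrix is an assumed input (e.g. as produced by a neural scorer).

```python
def eisner_best_score(score):
    """Eisner's O(n^3) DP for the best projective dependency tree score.
    score[h][m] is the weight of an arc from head h to modifier m;
    node 0 is the artificial root. Returns the best total arc score
    (backpointers for tree recovery are omitted for brevity)."""
    n = len(score)
    NEG = float("-inf")
    # C[d][s][t]: complete span s..t; I[d][s][t]: incomplete span s..t.
    # d = 0: head on the right (t); d = 1: head on the left (s).
    C = [[[0.0 if s == t else NEG for t in range(n)] for s in range(n)]
         for _ in range(2)]
    I = [[[NEG] * n for _ in range(n)] for _ in range(2)]
    for k in range(1, n):          # span width
        for s in range(n - k):
            t = s + k
            # Incomplete spans: add an arc between the endpoints s and t,
            # joining a right-complete and a left-complete sub-span.
            best = max(C[1][s][r] + C[0][r + 1][t] for r in range(s, t))
            I[0][s][t] = best + score[t][s]   # arc t -> s
            I[1][s][t] = best + score[s][t]   # arc s -> t
            # Complete spans: absorb an incomplete span on one side.
            C[0][s][t] = max(C[0][s][r] + I[0][r][t] for r in range(s, t))
            C[1][s][t] = max(I[1][s][r] + C[1][r][t]
                             for r in range(s + 1, t + 1))
    return C[1][0][n - 1]          # root 0 heads the whole sentence
```

For example, with root 0 and two words, `eisner_best_score([[0, 1, 1], [0, 0, 5], [0, 2, 0]])` returns 6, selecting the projective tree with arcs 0→1 and 1→2.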
Anthology ID:
N19-1022
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
251–260
URL:
https://aclanthology.org/N19-1022
DOI:
10.18653/v1/N19-1022
Cite (ACL):
Natalie Schluter. 2019. Recurrent models and lower bounds for projective syntactic decoding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 251–260, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Recurrent models and lower bounds for projective syntactic decoding (Schluter, NAACL 2019)
PDF:
https://aclanthology.org/N19-1022.pdf