Two Local Models for Neural Constituent Parsing

Zhiyang Teng, Yue Zhang


Abstract
Non-local features have been exploited by syntactic parsers to capture dependencies between output sub-structures. Such features have been a key to the success of state-of-the-art statistical parsers. With the rise of deep learning, however, it has been shown that local output decisions can give highly competitive accuracies, thanks to the power of dense neural input representations that embody global syntactic information. We investigate two conceptually simple local neural models for constituent parsing, which make local decisions on constituent spans and CFG rules, respectively. Consistent with previous findings along this line, our best model gives highly competitive results, achieving labeled bracketing F1 scores of 92.4% on PTB and 87.3% on CTB 5.1.
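As a rough illustration of what "local decisions on constituent spans" means, the sketch below labels every span of a sentence independently, with no structured inference over the whole tree. It is a minimal PyTorch sketch under stated assumptions: the module name, hyper-parameters, and the endpoint-concatenation span representation are illustrative choices, not the paper's exact architecture (see the linked code repository for the authors' implementation).

import torch
import torch.nn as nn

class LocalSpanClassifier(nn.Module):
    """Illustrative local span model: each span (i, j) is labeled
    independently; non-constituent spans get a dedicated null label."""

    def __init__(self, vocab_size, num_labels, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Span representation: concatenation of the two endpoint states.
        self.scorer = nn.Sequential(
            nn.Linear(4 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_labels),  # includes a "no constituent" label
        )

    def forward(self, word_ids, spans):
        # word_ids: (1, sentence_length); spans: list of (i, j) index pairs
        states, _ = self.bilstm(self.embed(word_ids))  # (1, n, 2 * hidden_dim)
        reps = [torch.cat([states[0, i], states[0, j]]) for i, j in spans]
        return self.scorer(torch.stack(reps))  # (num_spans, num_labels)


# Example: score all spans of a toy 5-word sentence.
model = LocalSpanClassifier(vocab_size=1000, num_labels=30)
words = torch.randint(0, 1000, (1, 5))
spans = [(i, j) for i in range(5) for j in range(i, 5)]
label_scores = model(words, spans)   # each span is classified locally
print(label_scores.shape)            # torch.Size([15, 30])

Because every span is scored on its own, any global consistency of the predicted tree must come from the neural input representation rather than from explicit structured decoding; this is the property the abstract refers to.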
Anthology ID:
C18-1011
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
119–132
URL:
https://aclanthology.org/C18-1011
Cite (ACL):
Zhiyang Teng and Yue Zhang. 2018. Two Local Models for Neural Constituent Parsing. In Proceedings of the 27th International Conference on Computational Linguistics, pages 119–132, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Two Local Models for Neural Constituent Parsing (Teng & Zhang, COLING 2018)
PDF:
https://aclanthology.org/C18-1011.pdf
Code:
zeeeyang/two-local-neural-conparsers