Deep Multitask Learning for Semantic Dependency Parsing

Hao Peng, Sam Thomson, Noah A. Smith


Abstract
We present a deep neural architecture that parses sentences into three semantic dependency graph formalisms. By using efficient, nearly arc-factored inference and a bidirectional-LSTM composed with a multi-layer perceptron, our base system is able to significantly improve the state of the art for semantic dependency parsing, without using hand-engineered features or syntax. We then explore two multitask learning approaches—one that shares parameters across formalisms, and one that uses higher-order structures to predict the graphs jointly. We find that both approaches improve performance across formalisms on average, achieving a new state of the art. Our code is open-source and available at https://github.com/Noahs-ARK/NeurboParser.
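A rough, self-contained sketch of the arc-factored scoring idea the abstract describes: a bidirectional LSTM encodes the sentence, and a small MLP scores each candidate head-dependent arc from the concatenated encoder states. This is an illustration only, not the authors' released NeurboParser code; the class name, layer sizes, and the use of PyTorch are assumptions, and labeled arcs, multitask parameter sharing, and structured inference are omitted.

# Illustrative sketch (assumed PyTorch implementation, not the paper's code):
# BiLSTM encoder + MLP arc scorer, scoring every (head, dependent) pair.
import torch
import torch.nn as nn

class ArcFactoredScorer(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=200, mlp_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, num_layers=2,
                              bidirectional=True, batch_first=True)
        # MLP over the concatenated head and dependent BiLSTM states.
        self.mlp = nn.Sequential(
            nn.Linear(4 * hidden_dim, mlp_dim),
            nn.Tanh(),
            nn.Linear(mlp_dim, 1),
        )

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        states, _ = self.bilstm(self.embed(token_ids))        # (batch, n, 2*hidden)
        n = states.size(1)
        heads = states.unsqueeze(2).expand(-1, -1, n, -1)     # (batch, n, n, 2*hidden)
        deps = states.unsqueeze(1).expand(-1, n, -1, -1)      # (batch, n, n, 2*hidden)
        pair = torch.cat([heads, deps], dim=-1)               # (batch, n, n, 4*hidden)
        return self.mlp(pair).squeeze(-1)                     # arc scores (batch, n, n)

# Usage: scores[b, i, j] scores an arc from token i to token j; a decoder
# would then select a high-scoring dependency graph for each formalism.
if __name__ == "__main__":
    scorer = ArcFactoredScorer(vocab_size=1000)
    scores = scorer(torch.randint(0, 1000, (2, 7)))
    print(scores.shape)  # torch.Size([2, 7, 7])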
Anthology ID: P17-1186
Volume: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2017
Address: Vancouver, Canada
Editors: Regina Barzilay, Min-Yen Kan
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 2037–2048
URL: https://aclanthology.org/P17-1186
DOI: 10.18653/v1/P17-1186
Cite (ACL): Hao Peng, Sam Thomson, and Noah A. Smith. 2017. Deep Multitask Learning for Semantic Dependency Parsing. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2037–2048, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal): Deep Multitask Learning for Semantic Dependency Parsing (Peng et al., ACL 2017)
PDF: https://aclanthology.org/P17-1186.pdf
Code: Noahs-ARK/NeurboParser (https://github.com/Noahs-ARK/NeurboParser)