Combining Global Models for Parsing Universal Dependencies

Tianze Shi, Felix G. Wu, Xilun Chen, Yao Cheng


Abstract
We describe our entry, C2L2, to the CoNLL 2017 shared task on parsing Universal Dependencies from raw text. Our system features an ensemble of three global parsing paradigms, one graph-based and two transition-based. Each model leverages character-level bi-directional LSTMs as lexical feature extractors to encode morphological information. Though relying on baseline tokenizers and focusing only on parsing, our system ranked second in the official end-to-end evaluation with a macro-average of 75.00 LAS F1 score over 81 test treebanks. In addition, we had the top average performance on the four surprise languages and on the small treebank subset.
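The abstract mentions character-level bi-directional LSTMs as lexical feature extractors that encode morphological information. A minimal numpy sketch of that idea follows; the class and function names, dimensions, and initialization here are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Bare-bones LSTM cell with one combined weight matrix for all four gates."""
    def __init__(self, in_dim, hid_dim, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hid_dim)
        # rows: input, forget, cell-candidate, output gates (stacked)
        self.W = rng.uniform(-scale, scale, (4 * hid_dim, in_dim + hid_dim))
        self.b = np.zeros(4 * hid_dim)
        self.hid_dim = hid_dim

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hid_dim
        i = sigmoid(z[:H])          # input gate
        f = sigmoid(z[H:2 * H])     # forget gate
        g = np.tanh(z[2 * H:3 * H]) # cell candidate
        o = sigmoid(z[3 * H:])      # output gate
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new

def char_bilstm_embed(word, char_vecs, fwd, bwd):
    """Encode a word as the concatenation of the final forward and backward
    LSTM states over its character sequence (a common char-BiLSTM scheme)."""
    H = fwd.hid_dim
    h_f, c_f = np.zeros(H), np.zeros(H)
    for ch in word:                      # left-to-right pass
        h_f, c_f = fwd.step(char_vecs[ch], h_f, c_f)
    h_b, c_b = np.zeros(H), np.zeros(H)
    for ch in reversed(word):            # right-to-left pass
        h_b, c_b = bwd.step(char_vecs[ch], h_b, c_b)
    return np.concatenate([h_f, h_b])    # word representation, length 2*H
```

Because the word vector is built from characters rather than a closed vocabulary, the same extractor covers rare and unseen word forms, which is one reason such encoders help across morphologically diverse Universal Dependencies treebanks.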
Anthology ID: K17-3003
Volume: Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies
Month: August
Year: 2017
Address: Vancouver, Canada
Venue: CoNLL
SIG: SIGNLL
Publisher: Association for Computational Linguistics
Pages: 31–39
URL: https://aclanthology.org/K17-3003
DOI: 10.18653/v1/K17-3003
Cite (ACL): Tianze Shi, Felix G. Wu, Xilun Chen, and Yao Cheng. 2017. Combining Global Models for Parsing Universal Dependencies. In Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, pages 31–39, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal): Combining Global Models for Parsing Universal Dependencies (Shi et al., CoNLL 2017)
PDF: https://aclanthology.org/K17-3003.pdf
Data: Universal Dependencies