Zhenqi Zhu


2019

Deconstructing Supertagging into Multi-Task Sequence Prediction
Zhenqi Zhu | Anoop Sarkar
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)

Supertagging is a sequence prediction task in which each word is assigned a piece of complex syntactic structure called a supertag. We provide a novel approach to multi-task learning for Tree Adjoining Grammar (TAG) supertagging by deconstructing these complex supertags to define a set of related auxiliary sequence prediction tasks. Our multi-task prediction framework is trained on exactly the same training data used to train the original supertagger, where each auxiliary task provides an alternative view on the original prediction task. Our experimental results show that our multi-task approach significantly improves TAG supertagging, achieving a new state-of-the-art accuracy of 91.39% on the Penn Treebank supertagging dataset.
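The multi-task setup described in the abstract naturally maps onto a shared-encoder, multiple-heads architecture: one classifier predicts the full supertag while auxiliary classifiers predict pieces of its decomposition, all from the same encoded sequence. The following is a minimal PyTorch sketch, not the authors' implementation; the task names, label-set sizes, and the auxiliary decomposition used here ("root_label", "spine") are purely illustrative assumptions.

```python
# Minimal sketch of multi-task sequence prediction for supertagging:
# a shared BiLSTM encoder feeds one linear head per task. The task
# inventory and label counts below are hypothetical, for illustration only.
import torch
import torch.nn as nn

class MultiTaskSupertagger(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden_dim, task_label_sizes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # One classifier per task: the primary supertag task plus
        # auxiliary tasks derived from the deconstructed supertag.
        self.heads = nn.ModuleDict({
            task: nn.Linear(2 * hidden_dim, n_labels)
            for task, n_labels in task_label_sizes.items()
        })

    def forward(self, token_ids):
        states, _ = self.encoder(self.embed(token_ids))
        # Per-token logits for every task, computed from the shared states.
        return {task: head(states) for task, head in self.heads.items()}

# Hypothetical tagsets: full supertags plus two smaller auxiliary views.
tasks = {"supertag": 4000, "root_label": 30, "spine": 300}
model = MultiTaskSupertagger(vocab_size=10000, emb_dim=100,
                             hidden_dim=256, task_label_sizes=tasks)
logits = model(torch.randint(0, 10000, (2, 12)))  # 2 sentences, 12 tokens each
# Joint training objective: sum the per-task cross-entropy losses
# (random targets here, just to show the shapes line up).
loss = sum(nn.functional.cross_entropy(
               logits[t].reshape(-1, n), torch.randint(0, n, (2 * 12,)))
           for t, n in tasks.items())
```

Because every head shares the same encoder, gradients from the smaller auxiliary tagsets act as a regularizer on the representation used for the much larger primary supertag tagset, which is the intuition behind the multi-task gains reported in the paper.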