Eliyahu Kiperwasser


2018

Scheduled Multi-Task Learning: From Syntax to Translation
Eliyahu Kiperwasser | Miguel Ballesteros
Transactions of the Association for Computational Linguistics, Volume 6

Neural encoder-decoder models of machine translation have achieved impressive results, while learning linguistic knowledge of both the source and target languages in an implicit end-to-end manner. We propose a framework in which the model begins by learning syntax and translation in an interleaved fashion, gradually shifting its focus to translation. Using this approach, we achieve considerable improvements in BLEU score on a relatively large parallel corpus (WMT14 English to German) and in a low-resource setup (WIT German to English).
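
To make the scheduling idea concrete, here is a small, self-contained Python sketch. The sigmoid-shaped schedule and all names are illustrative assumptions, not the paper's exact recipe: parsing and translation batches are interleaved roughly evenly at first, and translation batches come to dominate as training progresses.

import math
import random

def translation_share(step: int, total_steps: int, steepness: float = 10.0) -> float:
    """Probability of drawing a translation batch at a given step.

    Rises smoothly from about 0.5 early in training toward 1.0 at the
    end, so the two tasks start interleaved and translation takes over.
    """
    progress = step / total_steps
    return 0.5 + 0.5 / (1.0 + math.exp(-steepness * (progress - 0.5)))

# Simulate the task mix over a training run of 10,000 batches.
random.seed(0)
total = 10_000
quarters = [[0, 0] for _ in range(4)]        # [syntax, translation] per quarter
for step in range(total):
    is_translation = random.random() < translation_share(step, total)
    quarters[step * 4 // total][int(is_translation)] += 1
for i, (syntax, translation) in enumerate(quarters, start=1):
    print(f"quarter {i}: {syntax} syntax batches, {translation} translation batches")

In an actual multi-task trainer, the syntax branch would update the shared encoder via a parsing loss while the translation branch updates the full encoder-decoder; the schedule only decides which loss is applied at each step.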

2017

From Raw Text to Universal Dependencies - Look, No Tags!
Miryam de Lhoneux | Yan Shao | Ali Basirat | Eliyahu Kiperwasser | Sara Stymne | Yoav Goldberg | Joakim Nivre
Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies

We present the Uppsala submission to the CoNLL 2017 shared task on parsing from raw text to universal dependencies. Our system is a simple pipeline consisting of two components. The first performs joint word and sentence segmentation on raw text; the second predicts dependency trees from raw words. The parser bypasses the need for part-of-speech tagging, but uses word embeddings based on universal tag distributions. We achieved a macro-averaged LAS F1 of 65.11 in the official test run, which improved to 70.49 after bug fixes. We obtained the 2nd best result for sentence segmentation with a score of 89.03.
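
One component worth illustrating is the use of word embeddings based on universal tag distributions in place of predicted POS tags. The sketch below is a hypothetical reconstruction of that idea, not the submission's implementation: each word form is mapped to its empirical distribution over universal POS tags, estimated from an auto-tagged corpus, so ambiguous words keep probability mass on several tags instead of committing to one.

from collections import Counter, defaultdict
from typing import Dict, List, Tuple

UPOS = ["ADJ", "ADP", "ADV", "AUX", "CCONJ", "DET", "INTJ", "NOUN", "NUM",
        "PART", "PRON", "PROPN", "PUNCT", "SCONJ", "SYM", "VERB", "X"]

def tag_distribution_vectors(
    tagged_corpus: List[List[Tuple[str, str]]],
) -> Dict[str, List[float]]:
    """Map each word form to its empirical distribution over universal tags."""
    counts: Dict[str, Counter] = defaultdict(Counter)
    for sentence in tagged_corpus:
        for word, tag in sentence:
            counts[word.lower()][tag] += 1
    vectors = {}
    for word, tag_counts in counts.items():
        total = sum(tag_counts.values())
        vectors[word] = [tag_counts[t] / total for t in UPOS]
    return vectors

# "can" stays ambiguous between AUX and NOUN rather than getting one hard tag.
corpus = [[("I", "PRON"), ("can", "AUX"), ("swim", "VERB")],
          [("a", "DET"), ("can", "NOUN"), ("of", "ADP"), ("beans", "NOUN")]]
print(tag_distribution_vectors(corpus)["can"])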

2016

Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations
Eliyahu Kiperwasser | Yoav Goldberg
Transactions of the Association for Computational Linguistics, Volume 4

We present a simple and effective scheme for dependency parsing which is based on bidirectional LSTMs (BiLSTMs). Each sentence token is associated with a BiLSTM vector representing the token in its sentential context, and feature vectors are constructed by concatenating a few BiLSTM vectors. The BiLSTM is trained jointly with the parser objective, resulting in very effective feature extractors for parsing. We demonstrate the effectiveness of the approach by applying it to a greedy transition-based parser as well as to a globally optimized graph-based parser. The resulting parsers have very simple architectures, and match or surpass the state-of-the-art accuracies on English and Chinese.
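
As a concrete illustration of the feature scheme, here is a minimal PyTorch sketch of a graph-based variant; the dimensions and the MLP scorer below are illustrative assumptions rather than the paper's exact hyperparameters. Every token gets a BiLSTM vector, and the score of a candidate arc is computed from the concatenation of the head's and modifier's vectors.

import torch
import torch.nn as nn

class BiLSTMArcScorer(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 100, hid_dim: int = 125):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hid_dim, num_layers=2,
                              bidirectional=True, batch_first=True)
        # MLP over the concatenated head and modifier BiLSTM vectors.
        self.mlp = nn.Sequential(nn.Linear(4 * hid_dim, 100), nn.Tanh(),
                                 nn.Linear(100, 1))

    def forward(self, word_ids: torch.Tensor) -> torch.Tensor:
        vecs, _ = self.bilstm(self.embed(word_ids))        # (1, n, 2*hid_dim)
        n = vecs.size(1)
        heads = vecs.unsqueeze(2).expand(-1, n, n, -1)     # head vector at i
        mods = vecs.unsqueeze(1).expand(-1, n, n, -1)      # modifier vector at j
        pairs = torch.cat([heads, mods], dim=-1)           # concatenated features
        return self.mlp(pairs).squeeze(-1)                 # (1, n, n) arc scores

# Usage: scores[0, i, j] is the score of an arc with head i and modifier j;
# a graph-based decoder would pick the highest-scoring tree over these scores.
scorer = BiLSTMArcScorer(vocab_size=1000)
scores = scorer(torch.randint(0, 1000, (1, 6)))

Because the BiLSTM is trained through the parsing loss, the handful of concatenated vectors replaces the large, hand-engineered feature templates of earlier parsers.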

Easy-First Dependency Parsing with Hierarchical Tree LSTMs
Eliyahu Kiperwasser | Yoav Goldberg
Transactions of the Association for Computational Linguistics, Volume 4

We suggest a compositional vector representation of parse trees that relies on a recursive combination of recurrent neural network encoders. To demonstrate its effectiveness, we use the representation as the backbone of a greedy, bottom-up dependency parser, achieving very strong accuracies for English and Chinese, without relying on external word embeddings. The parser’s implementation is available for download at the first author’s webpage.
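
The following is a hedged sketch of the recursive composition idea, as a simplified hypothetical variant rather than the paper's exact architecture: each head word's vector is built by one LSTM reading its left dependents followed by the head and another reading the head followed by its right dependents, with child subtrees encoded bottom-up by the same procedure.

import torch
import torch.nn as nn

EMB, HID = 50, 50

class TreeEncoder(nn.Module):
    def __init__(self, vocab_size: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, EMB)
        self.proj = nn.Linear(2 * HID, EMB)       # map a node vector back to input size
        self.left = nn.LSTM(EMB, HID, batch_first=True)
        self.right = nn.LSTM(EMB, HID, batch_first=True)

    def encode(self, node) -> torch.Tensor:
        """node = (word_id, left_children, right_children); returns a (2*HID,) vector."""
        own = self.embed(torch.tensor([node[0]]))                 # (1, EMB)
        lseq = [self.encode_child(c) for c in node[1]] + [own]
        rseq = [own] + [self.encode_child(c) for c in node[2]]
        _, (hl, _) = self.left(torch.stack(lseq, dim=1))          # left dependents + head
        _, (hr, _) = self.right(torch.stack(rseq, dim=1))         # head + right dependents
        return torch.cat([hl[-1, 0], hr[-1, 0]])                  # (2*HID,)

    def encode_child(self, child) -> torch.Tensor:
        return self.proj(self.encode(child)).unsqueeze(0)         # recurse, (1, EMB)

# A tree is (word_id, left_children, right_children); leaves have no children.
enc = TreeEncoder(vocab_size=100)
tree = (7, [(3, [], []), (5, [], [])], [(9, [], [])])   # head 7 with three dependents
print(enc.encode(tree).shape)                           # torch.Size([100])

In the parser, an easy-first strategy would repeatedly attach the most confidently scored subtree pair and re-encode, so every attachment decision sees composed representations of the subtrees built so far.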

2015

Semi-supervised Dependency Parsing using Bilexical Contextual Features from Auto-Parsed Data
Eliyahu Kiperwasser | Yoav Goldberg
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing