Patrick Simianer


Measuring Immediate Adaptation Performance for Neural Machine Translation
Patrick Simianer | Joern Wuebker | John DeNero
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)

Incremental domain adaptation, in which a system learns from the correct output for each input immediately after making its prediction for that input, can dramatically improve system performance for interactive machine translation. Users of interactive systems are sensitive to the speed of adaptation and to how often a system repeats mistakes despite being corrected. Adaptation is most commonly assessed using corpus-level BLEU- or TER-derived metrics that do not explicitly take adaptation speed into account. We find that these metrics often fail to capture immediate adaptation effects, such as zero-shot and one-shot learning of domain-specific lexical items. We therefore propose new metrics that directly evaluate immediate adaptation performance for machine translation, and use them to choose the most suitable method from a range of adaptation techniques for neural machine translation systems.
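The paper's exact metric definitions are in the full text; as a rough sketch of what zero-shot or one-shot recall of a domain-specific lexical item might measure (the function, tokenization, and term set below are illustrative assumptions, not the authors' formulation):

```python
from collections import defaultdict

def kshot_recall(pairs, terms, k):
    """Recall of domain terms at their (k+1)-th reference occurrence.

    k=0 asks whether the system produced the term before ever seeing it
    in a correction (zero-shot); k=1 asks whether it produced the term
    after exactly one correction (one-shot).
    """
    seen = defaultdict(int)  # reference occurrences of each term so far
    hits, total = 0, 0
    for hyp, ref in pairs:
        hyp_toks, ref_toks = hyp.split(), ref.split()
        for t in terms:
            if t in ref_toks:
                if seen[t] == k:
                    total += 1
                    hits += t in hyp_toks
                seen[t] += 1
    return hits / total if total else 0.0

# Hypothetical stream: the system misses "sprocket" on first sight,
# then produces it after one correction.
pairs = [
    ("the valve broke", "the sprocket broke"),
    ("the sprocket turns", "the sprocket turns"),
]
print(kshot_recall(pairs, {"sprocket"}, 0))  # zero-shot: 0.0
print(kshot_recall(pairs, {"sprocket"}, 1))  # one-shot: 1.0
```

An adaptive system that stops repeating a corrected mistake scores high one-shot recall even when corpus-level BLEU barely moves.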


Compact Personalized Models for Neural Machine Translation
Joern Wuebker | Patrick Simianer | John DeNero
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing

We propose and compare methods for gradient-based domain adaptation of self-attentive neural machine translation models. We demonstrate that a large proportion of model parameters can be frozen during adaptation with minimal or no reduction in translation quality by encouraging structured sparsity in the set of offset tensors during learning via group lasso regularization. We evaluate this technique for both batch and incremental adaptation across multiple data sets and language pairs. Our system architecture, combining a state-of-the-art self-attentive model with compact domain adaptation, provides high-quality personalized machine translation that is both space- and time-efficient.
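Group lasso regularization itself is standard: it adds a penalty lambda * sum_g ||w_g||_2 over parameter groups, and its proximal operator zeroes out whole groups at once. A minimal sketch of that proximal step, assuming each adaptation offset tensor forms one group (the tensor names and values are hypothetical, not from the paper):

```python
import numpy as np

def group_lasso_prox(offsets, lam):
    """Proximal step for a group-lasso penalty: shrink each offset
    tensor (one group) toward zero; any group whose L2 norm falls
    below lam is zeroed entirely, so those base parameters stay
    effectively frozen and need not be stored per domain."""
    result = {}
    for name, w in offsets.items():
        norm = np.linalg.norm(w)
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        result[name] = scale * w
    return result

# Two hypothetical offset tensors: a tiny one (pruned away entirely)
# and a large one (kept, mildly shrunk).
offsets = {
    "layer0.attn": np.full((2, 2), 0.01),
    "layer5.ffn": np.full((2, 2), 1.0),
}
pruned = group_lasso_prox(offsets, lam=0.1)
```

Pruned groups contribute nothing to the personalized model, which is what makes the per-user adaptation compact.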


A Post-editing Interface for Immediate Adaptation in Statistical Machine Translation
Patrick Simianer | Sariya Karimova | Stefan Riezler
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: System Demonstrations

Adaptive machine translation (MT) systems are a promising approach for improving the effectiveness of computer-aided translation (CAT) environments. However, the question of how such a system could be implemented has so far been examined almost exclusively in theoretical work. We present an open-source post-editing interface for adaptive statistical MT that has in-depth monitoring capabilities and excellent expandability, and can facilitate practical studies. To this end, we designed text-based and graphical post-editing interfaces; the graphical interface offers means for displaying and editing a rich view of the MT output. Our translation systems can learn from post-edits using several adaptation techniques for weights and language models, as well as a novel translation model adaptation technique, in part by exploiting the output of the graphical interface. A user study shows that the proposed interface and adaptation methods reduce both technical effort and translation time.


The Heidelberg University English-German translation system for IWSLT 2015
Laura Jehl | Patrick Simianer | Julian Hirschler | Stefan Riezler
Proceedings of the 12th International Workshop on Spoken Language Translation: Evaluation Campaign


Offline extraction of overlapping phrases for hierarchical phrase-based translation
Sariya Karimova | Patrick Simianer | Stefan Riezler
Proceedings of the 11th International Workshop on Spoken Language Translation: Papers

Standard SMT decoders operate by translating disjoint spans of input words, thus discarding information, in the form of overlapping phrases, that is present at phrase extraction time. Using overlapping phrases in translation may enhance fluency at positions that would otherwise be phrase boundaries, provide additional statistical support for long and rare phrases, and generate new phrases that have never been seen in the training data. We show how to extract overlapping phrases offline for hierarchical phrase-based SMT, and how to extract features and tune weights for the new phrases. We find gains of 0.3 to 0.6 BLEU points over discriminatively trained hierarchical phrase-based SMT systems on two datasets for German-to-English translation.
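As a toy illustration of the core idea, restricted to monolingual source spans (the actual method works on aligned bilingual phrase pairs, and the function below is an illustrative sketch, not the authors' extraction algorithm): two extracted phrase spans that overlap can be merged offline into a longer phrase that crosses the would-be boundary.

```python
def merge_overlapping(spans, max_len=5):
    """Merge pairs of extracted phrase spans (start, end), end-exclusive,
    that overlap, yielding new longer spans that bridge what would
    otherwise be a phrase boundary at decoding time."""
    merged = set()
    for a_start, a_end in spans:
        for b_start, b_end in spans:
            # b must start strictly inside a and extend beyond it
            if a_start < b_start < a_end < b_end and b_end - a_start <= max_len:
                merged.add((a_start, b_end))
    return sorted(merged)

# Spans (0,2) and (1,3) overlap on word 1 and merge into (0,3);
# (3,5) is disjoint from both and yields nothing new.
print(merge_overlapping([(0, 2), (1, 3), (3, 5)]))  # [(0, 3)]
```

Because the merge runs offline, the new phrases can be scored and assigned features like any other entry in the phrase table, which is what allows the weights to be tuned for them.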

Response-based Learning for Grounded Machine Translation
Stefan Riezler | Patrick Simianer | Carolin Haas
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)


Generative and Discriminative Methods for Online Adaptation in SMT
Katharina Wäschle | Patrick Simianer | Nicola Bertoldi | Stefan Riezler | Marcello Federico
Proceedings of Machine Translation Summit XIV: Papers

Multi-Task Learning for Improved Discriminative Training in SMT
Patrick Simianer | Stefan Riezler
Proceedings of the Eighth Workshop on Statistical Machine Translation

The Heidelberg University machine translation systems for IWSLT2013
Patrick Simianer | Laura Jehl | Stefan Riezler
Proceedings of the 10th International Workshop on Spoken Language Translation: Evaluation Campaign

We present our systems for the machine translation evaluation campaign of the International Workshop on Spoken Language Translation (IWSLT) 2013. We submitted systems for three language directions: German-to-English, Russian-to-English and English-to-Russian. Our approaches focus on effective use of the in-domain parallel training data, which we exploit to tune parameter weights for millions of sparse lexicalized features using efficient parallelized stochastic learning techniques. For German-to-English we incorporate syntax features. We combine all of our systems with large language models. For the systems involving Russian, we also incorporate additional data when building the translation models.


Joint Feature Selection in Distributed Stochastic Learning for Large-Scale Discriminative Training in SMT
Patrick Simianer | Stefan Riezler | Chris Dyer
Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)