Quoc Khanh Do

Also published as: Quoc-Khanh Do


2016

Apprentissage discriminant de modèles neuronaux pour la traduction automatique [Discriminative training of continuous space translation models]
Quoc-Khanh Do | Alexandre Allauzen | François Yvon
Traitement Automatique des Langues, Volume 57, Numéro 1 : Varia [Varia]

2015

Apprentissage discriminant des modèles continus de traduction [Discriminative training of continuous translation models]
Quoc-Khanh Do | Alexandre Allauzen | François Yvon
Actes de la 22e conférence sur le Traitement Automatique des Langues Naturelles. Articles longs

While neural networks play an increasingly important role in natural language processing, most current training methods rely on criteria that are decorrelated from the target application. This article proposes a new discriminative training framework for estimating continuous translation models. The framework rests on the definition of an optimization criterion that takes into account, on the one hand, the metric used to evaluate translations and, on the other hand, the integration of these models within machine translation systems. This training method is also compared with existing estimation criteria, namely maximum likelihood and noise contrastive estimation. Experiments carried out on the English-to-French TED Talks translation task demonstrate the relevance of a discriminative training framework, although its performance remains highly dependent on the choice of a suitable initialization strategy. We show that, with a judicious initialization, significant BLEU score gains can be obtained.
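A minimal sketch of what such a metric-aware discriminative criterion could look like, assuming an n-best rescoring setup in which every hypothesis carries a neural model score and a sentence-level BLEU value; the function name, the tensor names and the hinge margin below are illustrative assumptions, not the formulation used in the article:

    import torch

    def discriminative_nbest_loss(model_scores, bleu_scores, margin=1.0):
        # Pairwise hinge loss over an n-best list: hypotheses with higher
        # sentence-level BLEU should receive higher model scores.
        # model_scores: tensor of shape (n,) from the neural translation model.
        # bleu_scores:  tensor of shape (n,) with sentence-level BLEU values.
        better = bleu_scores.unsqueeze(1) > bleu_scores.unsqueeze(0)   # pairs (i, j) where i beats j
        diffs = model_scores.unsqueeze(1) - model_scores.unsqueeze(0)  # score_i - score_j
        losses = torch.clamp(margin - diffs, min=0.0)                  # hinge on the score gap
        if not better.any():
            return model_scores.sum() * 0.0                            # all hypotheses tie under BLEU
        return losses[better].mean()

Unlike maximum likelihood or noise contrastive estimation, the gradient of such a loss is driven by the ranking that BLEU induces over competing hypotheses, which is the coupling between training criterion and evaluation metric that the abstract argues for.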

A Discriminative Training Procedure for Continuous Translation Models
Quoc-Khanh Do | Alexandre Allauzen | François Yvon
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing

The KIT-LIMSI Translation System for WMT 2015
Thanh-Le Ha | Quoc-Khanh Do | Eunah Cho | Jan Niehues | Alexandre Allauzen | François Yvon | Alex Waibel
Proceedings of the Tenth Workshop on Statistical Machine Translation

LIMSI@WMT’15 : Translation Task
Benjamin Marie | Alexandre Allauzen | Franck Burlot | Quoc-Khanh Do | Julia Ive | Elena Knyazeva | Matthieu Labeau | Thomas Lavergne | Kevin Löser | Nicolas Pécheux | François Yvon
Proceedings of the Tenth Workshop on Statistical Machine Translation

ListNet-based MT Rescoring
Jan Niehues | Quoc Khanh Do | Alexandre Allauzen | Alex Waibel
Proceedings of the Tenth Workshop on Statistical Machine Translation

2014

LIMSI English-French speech translation system
Natalia Segal | Hélène Bonneau-Maynard | Quoc Khanh Do | Alexandre Allauzen | Jean-Luc Gauvain | Lori Lamel | François Yvon
Proceedings of the 11th International Workshop on Spoken Language Translation: Evaluation Campaign

This paper documents the systems developed by LIMSI for the IWSLT 2014 speech translation task (English→French). The objective of this participation was twofold: adapting different components of the baseline ASR system to the peculiarities of TED talks, and improving machine translation quality on the automatic speech recognition output. For the latter task, several techniques were considered: punctuation and number normalization, adaptation to ASR errors, and the use of structured output layer neural network models for speech data.

Discriminative adaptation of continuous space translation models
Quoc-Khanh Do | Alexandre Allauzen | François Yvon
Proceedings of the 11th International Workshop on Spoken Language Translation: Papers

In this paper, we explore various adaptation techniques for continuous space translation models (CSTMs). We consider the following practical situation: given a large-scale, state-of-the-art SMT system containing a CSTM, the task is to adapt the CSTM to a new domain using a (relatively) small in-domain parallel corpus. Our method relies on the definition of a new discriminative loss function for the CSTM that borrows from both the max-margin and pair-wise ranking approaches. In our experiments, the baseline out-of-domain SMT system is initially trained for the WMT News translation task, and the CSTM is to be adapted to the lecture translation task as defined by the IWSLT evaluation campaign. Experimental results show that an improvement of 1.5 BLEU points can be achieved with the proposed adaptation method.
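As a rough illustration only, the max-margin, pair-wise ranking flavour of the adaptation described above could be sketched as follows, assuming the CSTM exposes a differentiable scoring function over (source, hypothesis) pairs and that hypothesis pairs from the in-domain n-best lists have already been ordered by sentence-level BLEU; cstm.score, adapt_step and the margin value are hypothetical placeholders, not the authors' implementation:

    import torch

    def adapt_step(cstm, optimizer, src, hyp_good, hyp_bad, margin=1.0):
        # One in-domain adaptation step on a pair of hypotheses for `src`,
        # where `hyp_good` has higher sentence-level BLEU than `hyp_bad`.
        # The out-of-domain CSTM parameters are the starting point and are
        # updated in place by the optimizer.
        optimizer.zero_grad()
        score_good = cstm.score(src, hyp_good)   # scalar model score (hypothetical API)
        score_bad = cstm.score(src, hyp_bad)
        # Max-margin, pair-wise ranking loss: the better hypothesis should
        # outscore the worse one by at least `margin`.
        loss = torch.clamp(margin - (score_good - score_bad), min=0.0)
        loss.backward()
        optimizer.step()
        return loss.item()

The point of the sketch is the design choice of fine-tuning the existing out-of-domain parameters with a ranking loss on a small in-domain corpus, rather than retraining the CSTM from scratch.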

The KIT-LIMSI Translation System for WMT 2014
Quoc Khanh Do | Teresa Herrmann | Jan Niehues | Alexander Allauzen | François Yvon | Alex Waibel
Proceedings of the Ninth Workshop on Statistical Machine Translation

LIMSI @ WMT’14 Medical Translation Task
Nicolas Pécheux | Li Gong | Quoc Khanh Do | Benjamin Marie | Yulia Ivanishcheva | Alexander Allauzen | Thomas Lavergne | Jan Niehues | Aurélien Max | François Yvon
Proceedings of the Ninth Workshop on Statistical Machine Translation

Comparison of scheduling methods for the learning rate of neural network language models (Modèles de langue neuronaux: une comparaison de plusieurs stratégies d’apprentissage) [in French]
Quoc-Khanh Do | Alexandre Allauzen | François Yvon
Proceedings of TALN 2014 (Volume 1: Long Papers)

2013

LIMSI @ WMT13
Alexander Allauzen | Nicolas Pécheux | Quoc Khanh Do | Marco Dinarelli | Thomas Lavergne | Aurélien Max | Hai-Son Le | François Yvon
Proceedings of the Eighth Workshop on Statistical Machine Translation

Joint WMT 2013 Submission of the QUAERO Project
Stephan Peitz | Saab Mansour | Matthias Huck | Markus Freitag | Hermann Ney | Eunah Cho | Teresa Herrmann | Mohammed Mediani | Jan Niehues | Alex Waibel | Alexander Allauzen | Quoc Khanh Do | Bianka Buschbeck | Tonio Wandmacher
Proceedings of the Eighth Workshop on Statistical Machine Translation