Fast and Accurate Reordering with ITG Transition RNN

Hao Zhang, Axel Ng, Richard Sproat


Abstract
Attention-based sequence-to-sequence neural network models learn to jointly align and translate. The quadratic-time attention mechanism is powerful in that it can handle arbitrarily long-distance reordering, but it is computationally expensive. In this paper, towards making neural translation both accurate and efficient, we follow the traditional pre-reordering approach and decouple reordering from translation. We add a reordering RNN that shares the input encoder with the decoder. The RNNs are trained jointly with a multi-task loss function and applied sequentially at inference time. The task of the reordering model is to predict the permutation of the input words that follows the target-language word order. After reordering, the attention in the decoder becomes more peaked and monotonic. For reordering, we adopt the Inversion Transduction Grammar (ITG) and propose a transition system that parses the input into ITG trees for reordering. We harness the ITG transition system with an RNN. With the modeling power of RNNs, we achieve superior reordering accuracy without any feature engineering. In experiments, we apply the model to the task of text normalization. Compared to a strong attention-based RNN baseline, our ITG RNN reordering model reaches the same reordering accuracy with only 1/10 of the training data and is 2.5x faster at decoding.
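
To make the reordering mechanism concrete, below is a minimal sketch of how an ITG-style shift-reduce transition system can realize a permutation of the input. The action names, the stack representation, and the example are illustrative assumptions for exposition; the paper's actual transition set, features, and RNN scorer are not reproduced here.

```python
# Illustrative sketch of an ITG-style shift-reduce transition system.
# SHIFT pushes the next input word as a singleton span; REDUCE_S pops two
# spans and concatenates them in straight (original) order; REDUCE_I pops
# two spans and concatenates them in inverted order. Executing a full
# transition sequence yields a permutation of the input words.

SHIFT, REDUCE_S, REDUCE_I = "SHIFT", "REDUCE_S", "REDUCE_I"

def execute(words, actions):
    """Apply a transition sequence to a word list; return the permuted order."""
    stack, buffer = [], list(words)
    for action in actions:
        if action == SHIFT:
            stack.append([buffer.pop(0)])
        else:
            right, left = stack.pop(), stack.pop()
            stack.append(left + right if action == REDUCE_S else right + left)
    assert len(stack) == 1 and not buffer, "actions must yield a single spanning tree"
    return stack[0]

# Example: move the verb to the end, as needed for a verb-final target order.
print(execute(["he", "read", "the", "book"],
              [SHIFT, SHIFT, SHIFT, SHIFT, REDUCE_S, REDUCE_I, REDUCE_S]))
# -> ['he', 'the', 'book', 'read']
```

In the paper the transition sequence is predicted by an RNN over the shared encoder states; the snippet above shows only the deterministic execution of a given sequence.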
Anthology ID:
C18-1123
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
1454–1463
URL:
https://aclanthology.org/C18-1123
Cite (ACL):
Hao Zhang, Axel Ng, and Richard Sproat. 2018. Fast and Accurate Reordering with ITG Transition RNN. In Proceedings of the 27th International Conference on Computational Linguistics, pages 1454–1463, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Fast and Accurate Reordering with ITG Transition RNN (Zhang et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1123.pdf