Lattice Transformer for Speech Translation

Pei Zhang, Niyu Ge, Boxing Chen, Kai Fan


Abstract
Recent advances in sequence modeling have highlighted the strengths of the transformer architecture, especially in achieving state-of-the-art machine translation results. However, depending on the upstream systems, e.g., speech recognition or word segmentation, the input to a translation system can vary greatly. The goal of this work is to extend the attention mechanism of the transformer to naturally consume a lattice in addition to the traditional sequential input. We first propose a general lattice transformer for speech translation, where the input is the output of an automatic speech recognition (ASR) system that contains multiple paths and posterior scores. To leverage the extra information from the lattice structure, we develop a novel controllable lattice attention mechanism to obtain latent representations. On the LDC Spanish-English speech translation corpus, our experiments show that the lattice transformer generalizes significantly better and outperforms both a transformer baseline and a lattice LSTM. Additionally, we validate our approach on the WMT 2017 Chinese-English translation task with lattice inputs from different BPE segmentations. In this task, we also observe improvements over strong baselines.
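The abstract names the controllable lattice attention only at a high level; the exact formulation is given in the paper itself. As a rough, hypothetical sketch (not the authors' implementation), the PyTorch snippet below illustrates the general idea of lattice-aware attention: restricting attention to node pairs connected in the lattice via a reachability mask and biasing the attention scores with ASR posterior scores through a controllable gate. The function and parameter names (`lattice_attention`, `reach_mask`, `gate`) are illustrative assumptions, not the paper's API.

```python
# Hypothetical sketch of lattice-aware attention, assuming lattice nodes have
# already been embedded into vectors and that each node carries an ASR
# posterior score. This is NOT the paper's exact "controllable lattice
# attention"; it only shows the two ingredients the abstract mentions:
# a lattice-structure mask and posterior-score conditioning.
import torch
import torch.nn.functional as F

def lattice_attention(query, key, value, reach_mask, log_posterior, gate):
    """
    query, key, value: (batch, nodes, dim) lattice-node representations.
    reach_mask:        (batch, nodes, nodes) 1 where node j may be attended
                       from node i (connected in the lattice), else 0.
    log_posterior:     (batch, nodes) log ASR posterior score per node.
    gate:              scalar in [0, 1] controlling how strongly the posterior
                       scores bias the attention (hypothetical knob).
    """
    d = query.size(-1)
    scores = torch.matmul(query, key.transpose(-2, -1)) / d ** 0.5
    # Bias every attended node j by its (gated) log posterior score.
    scores = scores + gate * log_posterior.unsqueeze(1)
    # Forbid attention between node pairs not connected in the lattice.
    scores = scores.masked_fill(reach_mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, value)

# Toy usage: a 4-node lattice for a single sentence.
torch.manual_seed(0)
q = k = v = torch.randn(1, 4, 8)
mask = torch.tensor([[[1, 1, 1, 0],
                      [1, 1, 0, 1],
                      [1, 0, 1, 1],
                      [0, 1, 1, 1]]])
log_post = torch.log(torch.tensor([[0.9, 0.6, 0.4, 0.8]]))
out = lattice_attention(q, k, v, mask, log_post, gate=0.5)
print(out.shape)  # torch.Size([1, 4, 8])
```

With `gate=0` this reduces to masked self-attention over the lattice graph; increasing the gate makes nodes with higher ASR confidence receive more attention mass, which is the intuition behind conditioning on posterior scores.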
Anthology ID: P19-1649
Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2019
Address: Florence, Italy
Editors: Anna Korhonen, David Traum, Lluís Màrquez
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 6475–6484
URL: https://aclanthology.org/P19-1649
DOI: 10.18653/v1/P19-1649
Cite (ACL): Pei Zhang, Niyu Ge, Boxing Chen, and Kai Fan. 2019. Lattice Transformer for Speech Translation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 6475–6484, Florence, Italy. Association for Computational Linguistics.
Cite (Informal): Lattice Transformer for Speech Translation (Zhang et al., ACL 2019)
PDF: https://aclanthology.org/P19-1649.pdf