Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling

Shruti Bhosale, Kyra Yee, Sergey Edunov, Michael Auli


Abstract
Pre-training models on vast quantities of unlabeled data has emerged as an effective approach to improving accuracy on many NLP tasks. On the other hand, traditional machine translation has a long history of leveraging unlabeled data through noisy channel modeling. The same idea has recently been shown to achieve strong improvements for neural machine translation. Unfortunately, naïve noisy channel modeling with modern sequence to sequence models is up to an order of magnitude slower than alternatives. We address this issue by introducing efficient approximations to make inference with the noisy channel approach as fast as strong ensembles while increasing accuracy. We also show that the noisy channel approach can outperform strong pre-training results by achieving a new state of the art on WMT Romanian-English translation.
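For context, a minimal sketch of the noisy channel rescoring the abstract refers to, assuming the common formulation p(y|x) ∝ p(x|y)·p(y) combined with a direct model over length-normalized log scores; the data structure, interpolation weight, and normalization below are illustrative assumptions, not necessarily the paper's exact scheme:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """One k-best hypothesis y for a source x, with pre-computed log-probabilities."""
    tokens: list              # target tokens y
    log_p_y_given_x: float    # direct model   log p(y|x)
    log_p_x_given_y: float    # channel model  log p(x|y)
    log_p_y: float            # language model log p(y)

def noisy_channel_score(c: Candidate, src_len: int, lam: float = 1.0) -> float:
    # Bayes' rule: p(y|x) ∝ p(x|y) * p(y); interpolate with the direct model
    # and length-normalize each term (lam is a tunable weight, an assumption here).
    direct = c.log_p_y_given_x / max(len(c.tokens), 1)
    channel_and_lm = (c.log_p_x_given_y + c.log_p_y) / max(src_len, 1)
    return direct + lam * channel_and_lm

def rerank(candidates: list, src_len: int, lam: float = 1.0) -> Candidate:
    # Return the hypothesis with the highest combined score.
    return max(candidates, key=lambda c: noisy_channel_score(c, src_len, lam))
```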
Anthology ID:
2020.wmt-1.69
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
584–593
URL:
https://aclanthology.org/2020.wmt-1.69
Cite (ACL):
Shruti Bhosale, Kyra Yee, Sergey Edunov, and Michael Auli. 2020. Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling. In Proceedings of the Fifth Conference on Machine Translation, pages 584–593, Online. Association for Computational Linguistics.
Cite (Informal):
Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling (Bhosale et al., WMT 2020)
PDF:
https://aclanthology.org/2020.wmt-1.69.pdf
Video:
 https://slideslive.com/38939568
Code:
pytorch/fairseq
Data:
WMT 2016, WMT 2016 News