On Target Representation in Continuous-output Neural Machine Translation

Evgeniia Tokarchuk, Vlad Niculae


Abstract
Continuous generative models have proved their usefulness for high-dimensional data such as image and audio generation. However, continuous models for text generation have received limited attention from the community. In this work, we study continuous text generation using Transformers for neural machine translation (NMT). We argue that the choice of embeddings is crucial for such models, so we focus on one particular aspect: target representation via embeddings. We explore pretrained embeddings and also introduce knowledge transfer from the discrete Transformer model using embeddings in Euclidean and non-Euclidean spaces. Our results on the WMT Romanian-English and English-Turkish benchmarks show such transfer leads to the best-performing continuous model.
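To make the continuous-output setting concrete, below is a minimal sketch of how a decoder can predict target embeddings directly instead of a softmax distribution, trained with a distance-based loss and decoded by nearest neighbour in the target embedding table. This follows the general continuous-output NMT recipe the paper builds on, not the authors' exact implementation; the names (ContinuousOutputHead, emb_table) and the cosine loss are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContinuousOutputHead(nn.Module):
    """Projects decoder hidden states into the target embedding space."""

    def __init__(self, d_model: int, d_emb: int):
        super().__init__()
        self.proj = nn.Linear(d_model, d_emb)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, tgt_len, d_model) -> predicted embeddings (batch, tgt_len, d_emb)
        return self.proj(hidden)


def cosine_loss(pred: torch.Tensor, gold_emb: torch.Tensor) -> torch.Tensor:
    # Pull predictions towards the (typically frozen) gold target embeddings.
    return (1.0 - F.cosine_similarity(pred, gold_emb, dim=-1)).mean()


@torch.no_grad()
def nearest_neighbour_decode(pred: torch.Tensor, emb_table: torch.Tensor) -> torch.Tensor:
    # emb_table: (vocab_size, d_emb); decode each position to the closest embedding.
    sims = F.normalize(pred, dim=-1) @ F.normalize(emb_table, dim=-1).T
    return sims.argmax(dim=-1)  # token ids, shape (batch, tgt_len)
```

In this setup, the quality of `emb_table` (pretrained embeddings vs. embeddings transferred from a discrete Transformer) directly determines both the training signal and the decoding step, which is the design choice the paper studies.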
Anthology ID:
2022.repl4nlp-1.24
Volume:
Proceedings of the 7th Workshop on Representation Learning for NLP
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Spandana Gella, He He, Bodhisattwa Prasad Majumder, Burcu Can, Eleonora Giunchiglia, Samuel Cahyawijaya, Sewon Min, Maximilian Mozes, Xiang Lorraine Li, Isabelle Augenstein, Anna Rogers, Kyunghyun Cho, Edward Grefenstette, Laura Rimell, Chris Dyer
Venue:
RepL4NLP
Publisher:
Association for Computational Linguistics
Pages:
227–235
URL:
https://aclanthology.org/2022.repl4nlp-1.24
DOI:
10.18653/v1/2022.repl4nlp-1.24
Cite (ACL):
Evgeniia Tokarchuk and Vlad Niculae. 2022. On Target Representation in Continuous-output Neural Machine Translation. In Proceedings of the 7th Workshop on Representation Learning for NLP, pages 227–235, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
On Target Representation in Continuous-output Neural Machine Translation (Tokarchuk & Niculae, RepL4NLP 2022)
PDF:
https://aclanthology.org/2022.repl4nlp-1.24.pdf
Video:
https://aclanthology.org/2022.repl4nlp-1.24.mp4