Investigation of Transformer-based Latent Attention Models for Neural Machine Translation

Parnia Bahar, Nikita Makarov, Hermann Ney


Anthology ID:
2020.amta-research.2
Volume:
Proceedings of the 14th Conference of the Association for Machine Translation in the Americas (Volume 1: Research Track)
Month:
October
Year:
2020
Address:
Virtual
Editors:
Michael Denkowski, Christian Federmann
Venue:
AMTA
Publisher:
Association for Machine Translation in the Americas
Pages:
7–20
URL:
https://aclanthology.org/2020.amta-research.2
Cite (ACL):
Parnia Bahar, Nikita Makarov, and Hermann Ney. 2020. Investigation of Transformer-based Latent Attention Models for Neural Machine Translation. In Proceedings of the 14th Conference of the Association for Machine Translation in the Americas (Volume 1: Research Track), pages 7–20, Virtual. Association for Machine Translation in the Americas.
Cite (Informal):
Investigation of Transformer-based Latent Attention Models for Neural Machine Translation (Bahar et al., AMTA 2020)
PDF:
https://aclanthology.org/2020.amta-research.2.pdf