Big Bidirectional Insertion Representations for Documents

Lala Li, William Chan


Abstract
The Insertion Transformer is well suited for long-form text generation due to its parallel generation capabilities, requiring O(log₂ n) generation steps to generate n tokens. However, modeling long sequences is difficult, as there is more ambiguity captured in the attention mechanism. This work proposes the Big Bidirectional Insertion Representations for Documents (Big BIRD), an insertion-based model for document-level translation tasks. We scale up insertion-based models to long-form documents. Our key contribution is introducing sentence alignment via sentence-positional embeddings between the source and target document. We show an improvement of +4.3 BLEU on the WMT’19 English→German document-level translation task compared with the Insertion Transformer baseline.
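The O(log₂ n) claim in the abstract comes from the Insertion Transformer's parallel decoding: with t tokens already generated there are t + 1 insertion slots, and filling every slot simultaneously roughly doubles the sequence length each step. A minimal sketch of this step count (the function name is illustrative, not from the paper):

```python
import math

def parallel_insertion_steps(n: int) -> int:
    """Count decoding steps when one token is inserted into every
    available slot in parallel (balanced-tree insertion order)."""
    tokens, steps = 0, 0
    while tokens < n:
        # t tokens expose t + 1 slots, so the length goes t -> 2t + 1.
        tokens += tokens + 1
        steps += 1
    return steps

# The step count tracks ceil(log2(n + 1)):
for n in (10, 100, 1000):
    print(n, parallel_insertion_steps(n), math.ceil(math.log2(n + 1)))
```

After s steps the sequence holds 2^s − 1 tokens, so generating n tokens needs ⌈log₂(n + 1)⌉ steps, i.e. O(log₂ n).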
Anthology ID:
D19-5620
Volume:
Proceedings of the 3rd Workshop on Neural Generation and Translation
Month:
November
Year:
2019
Address:
Hong Kong
Editors:
Alexandra Birch, Andrew Finch, Hiroaki Hayashi, Ioannis Konstas, Thang Luong, Graham Neubig, Yusuke Oda, Katsuhito Sudoh
Venue:
NGT
Publisher:
Association for Computational Linguistics
Pages:
194–198
URL:
https://aclanthology.org/D19-5620
DOI:
10.18653/v1/D19-5620
Cite (ACL):
Lala Li and William Chan. 2019. Big Bidirectional Insertion Representations for Documents. In Proceedings of the 3rd Workshop on Neural Generation and Translation, pages 194–198, Hong Kong. Association for Computational Linguistics.
Cite (Informal):
Big Bidirectional Insertion Representations for Documents (Li & Chan, NGT 2019)
PDF:
https://aclanthology.org/D19-5620.pdf