Document Level Hierarchical Transformer

Najam Zaidi, Trevor Cohn, Gholamreza Haffari


Abstract
Generating long and coherent text is an important and challenging task encompassing many application areas such as summarization, document-level machine translation, and story generation. Despite their success in modeling intra-sentence coherence, existing long text generation models (e.g., BART and GPT-3) still struggle to maintain a coherent event sequence throughout the generated text. We conjecture that this is because it is difficult for such models to revise, replace, revoke, or delete any part of the text they have already generated. In this paper, we present a novel semi-autoregressive document generation model capable of revising and editing the generated text. Building on recent models (Gu et al., 2019; Xu and Carpuat, 2020), we formulate document generation as a hierarchical Markov decision process with a two-level hierarchy of high- and low-level editing programs. We train our model using imitation learning (Hussein et al., 2017) and introduce a roll-in policy such that each policy learns from the output of applying the previous action. Experiments with the proposed approach shed light on the challenges of long text generation using our model. We suggest various remedies, such as using a distilled dataset, designing better attention mechanisms, and using an autoregressive model as the low-level program.
Anthology ID:
2021.alta-1.13
Volume:
Proceedings of the 19th Annual Workshop of the Australasian Language Technology Association
Month:
December
Year:
2021
Address:
Online
Editors:
Afshin Rahimi, William Lane, Guido Zuccon
Venue:
ALTA
Publisher:
Australasian Language Technology Association
Pages:
128–137
URL:
https://aclanthology.org/2021.alta-1.13
Cite (ACL):
Najam Zaidi, Trevor Cohn, and Gholamreza Haffari. 2021. Document Level Hierarchical Transformer. In Proceedings of the 19th Annual Workshop of the Australasian Language Technology Association, pages 128–137, Online. Australasian Language Technology Association.
Cite (Informal):
Document Level Hierarchical Transformer (Zaidi et al., ALTA 2021)
PDF:
https://aclanthology.org/2021.alta-1.13.pdf