@inproceedings{dai-etal-2019-transformer,
    title     = "Transformer-{XL}: Attentive Language Models beyond a Fixed-Length Context",
    author    = "Dai, Zihang and Yang, Zhilin and Yang, Yiming and Carbonell, Jaime and Le, Quoc and Salakhutdinov, Ruslan",
    editor    = "Korhonen, Anna and Traum, David and Màrquez, Lluís",
    booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
    month     = jul,
    year      = "2019",
    address   = "Florence, Italy",
    publisher = "Association for Computational Linguistics",
    url       = "https://aclanthology.org/P19-1285/",
    doi       = "10.18653/v1/P19-1285",
    pages     = "2978--2988",
}