Title: The Case for Translation-Invariant Self-Attention in Transformer-Based Language Models
Authors: Ulme Wennberg, Gustav Eje Henter
Editors: Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venue: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Publisher: Association for Computational Linguistics
Location: Online
Date: August 2021
Pages: 130–140
Type: conference publication
Anthology ID: wennberg-henter-2021-case
DOI: 10.18653/v1/2021.acl-short.18
URL: https://aclanthology.org/2021.acl-short.18/