Michael Hassid, Hao Peng, Daniel Rotem, Jungo Kasai, Ivan Montero, Noah A. Smith, and Roy Schwartz. 2022. How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 1403–1416, Abu Dhabi, United Arab Emirates, December 2022. Yoav Goldberg, Zornitsa Kozareva, and Yue Zhang, editors. Association for Computational Linguistics. Anthology ID: hassid-etal-2022-much. DOI: 10.18653/v1/2022.findings-emnlp.101. URL: https://aclanthology.org/2022.findings-emnlp.101/