Interpreting Positional Information in Perspective of Word Order

Zhang Xilong, Liu Ruochen, Liu Jin, Liang Xuefeng


Abstract
The attention mechanism is a powerful and effective method widely used in natural language processing. However, it has been observed to be insensitive to positional information. Although several studies have attempted to improve positional encoding and to investigate the effect of word-order perturbation, it remains unclear how positional encoding affects NLP models from the perspective of word order. In this paper, we shed light on this problem by analyzing the working mechanism of the attention module and investigating the root cause of its inability to encode positional information. Our hypothesis is that this insensitivity stems from the weighted sum operation used in the attention module. To verify this hypothesis, we propose a novel weight concatenation operation and evaluate it on neural machine translation tasks. The experimental results show that the proposed operation effectively encodes positional information, confirming our hypothesis.
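The order-insensitivity the abstract refers to can be illustrated with a small permutation check: self-attention without positional encoding is permutation-equivariant, so shuffling the input tokens only shuffles the output rows and leaves each token's representation unchanged. The sketch below is a minimal PyTorch illustration of that general property, not the authors' implementation or their proposed weight concatenation operation.

```python
import torch

def attention(q, k, v):
    # Scaled dot-product attention: each output row is a weighted sum
    # over the value vectors, with softmax-normalized weights.
    scores = q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5
    weights = torch.softmax(scores, dim=-1)
    return weights @ v

# Toy check: without positional encoding, permuting the input tokens
# only permutes the rows of the output, so the weighted sum itself
# carries no word-order information.
torch.manual_seed(0)
x = torch.randn(5, 8)                 # 5 tokens, 8-dim embeddings
perm = torch.randperm(5)

out = attention(x, x, x)              # self-attention, original order
out_perm = attention(x[perm], x[perm], x[perm])

print(torch.allclose(out[perm], out_perm, atol=1e-6))  # True
```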
Anthology ID:
2023.acl-long.534
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
9600–9613
URL:
https://aclanthology.org/2023.acl-long.534
DOI:
10.18653/v1/2023.acl-long.534
Cite (ACL):
Zhang Xilong, Liu Ruochen, Liu Jin, and Liang Xuefeng. 2023. Interpreting Positional Information in Perspective of Word Order. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 9600–9613, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Interpreting Positional Information in Perspective of Word Order (Xilong et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.534.pdf
Video:
https://aclanthology.org/2023.acl-long.534.mp4