Towards preserving word order importance through Forced Invalidation

Hadeel Al-Negheimish, Pranava Madhyastha, Alessandra Russo


Abstract
Large pre-trained language models such as BERT have been widely used as a framework for natural language understanding (NLU) tasks. However, recent findings have revealed that pre-trained language models are insensitive to word order: performance on NLU tasks remains unchanged even after the words of a sentence are randomly permuted, destroying crucial syntactic information. To help preserve the importance of word order, we propose a simple approach called Forced Invalidation (FI): forcing the model to identify permuted sequences as invalid samples. We perform an extensive evaluation of our approach on various English NLU and QA tasks, using both BERT-based models and attention-based models over word embeddings. Our experiments demonstrate that FI significantly improves the sensitivity of the models to word order.
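The core idea described in the abstract, training the model to reject word-shuffled inputs, can be sketched as a data-augmentation step. The snippet below is a minimal illustration in that spirit, not the authors' implementation: the whitespace-based permutation, the extra `INVALID_LABEL` class, and all function names are assumptions made for this sketch.

```python
import random

# Hypothetical index for an extra "invalid" class appended to the
# task's label set; the paper's actual labelling scheme may differ.
INVALID_LABEL = -1


def permute_words(sentence: str, rng: random.Random) -> str:
    """Return the sentence with its words randomly shuffled,
    destroying the original word order (whitespace tokenisation
    is an assumption of this sketch)."""
    words = sentence.split()
    rng.shuffle(words)
    return " ".join(words)


def forced_invalidation_augment(examples, seed: int = 0):
    """Augment (sentence, label) pairs with shuffled copies labelled
    as invalid, so a classifier can only score well by being
    sensitive to word order."""
    rng = random.Random(seed)
    augmented = list(examples)
    for sentence, _ in examples:
        augmented.append((permute_words(sentence, rng), INVALID_LABEL))
    return augmented


if __name__ == "__main__":
    toy = [("the cat chased the mouse", 1)]
    for sent, label in forced_invalidation_augment(toy):
        print(label, "|", sent)
```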
Anthology ID:
2023.eacl-main.187
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
2563–2570
URL:
https://aclanthology.org/2023.eacl-main.187
DOI:
10.18653/v1/2023.eacl-main.187
Cite (ACL):
Hadeel Al-Negheimish, Pranava Madhyastha, and Alessandra Russo. 2023. Towards preserving word order importance through Forced Invalidation. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 2563–2570, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Towards preserving word order importance through Forced Invalidation (Al-Negheimish et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.187.pdf
Video:
https://aclanthology.org/2023.eacl-main.187.mp4