NevIR: Negation in Neural Information Retrieval

Orion Weller, Dawn Lawrie, Benjamin Van Durme


Abstract
Negation is a common everyday phenomenon and has been a consistent area of weakness for language models (LMs). Although the Information Retrieval (IR) community has adopted LMs as the backbone of modern IR architectures, there has been little to no research on understanding how negation impacts neural IR. We therefore construct a straightforward benchmark on this theme: asking IR models to rank two documents that differ only by negation. We show that the results vary widely according to the type of IR architecture: cross-encoders perform best, followed by late-interaction models, and in last place are bi-encoder and sparse neural architectures. We find that most current information retrieval models do not consider negation, performing similarly to or worse than randomly ranking. We show that although the obvious approach of continued fine-tuning on a dataset of contrastive documents containing negations increases performance (as does model size), there is still a large gap between machine and human performance.
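The pairwise setup described above can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the `pairwise_correct` helper and the toy word-overlap scorer (standing in for a real neural IR model) are assumptions. A model is credited only when it ranks the matching document higher for both queries in a contrastive pair, which is why random ranking scores well below 50% on this metric.

```python
def score(query: str, doc: str) -> float:
    """Toy relevance score: fraction of query terms that appear in the doc.
    (A stand-in for a real IR model's relevance score; assumption, not the paper's method.)"""
    q_terms = query.lower().split()
    d_terms = set(doc.lower().split())
    return sum(t in d_terms for t in q_terms) / len(q_terms)


def pairwise_correct(q1: str, q2: str, d1: str, d2: str, scorer=score) -> bool:
    """Credit the model only if it ranks d1 over d2 for q1 AND d2 over d1 for q2.

    q1/d1 and q2/d2 are the matching query-document pairs; the two documents
    differ only by a negation, so both rankings must flip correctly.
    """
    return scorer(q1, d1) > scorer(q1, d2) and scorer(q2, d2) > scorer(q2, d1)


# A negated pair defeats the word-overlap scorer: both documents contain
# nearly identical terms, so the scores tie and the pair is marked incorrect,
# illustrating why negation-blind models fail this benchmark.
q_pos = "animals that are dangerous"
q_neg = "animals that are not dangerous"
d_pos = "lions are dangerous"
d_neg = "lions are not dangerous"
print(pairwise_correct(q_pos, q_neg, d_pos, d_neg))  # the overlap scorer ties, so False
```

Note the asymmetry this exposes: a purely lexical (or negation-insensitive neural) scorer assigns near-identical scores to the two documents, so it cannot satisfy both ranking conditions at once.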
Anthology ID:
2024.eacl-long.139
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
2274–2287
URL:
https://aclanthology.org/2024.eacl-long.139
Cite (ACL):
Orion Weller, Dawn Lawrie, and Benjamin Van Durme. 2024. NevIR: Negation in Neural Information Retrieval. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2274–2287, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
NevIR: Negation in Neural Information Retrieval (Weller et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-long.139.pdf
Note:
 2024.eacl-long.139.note.zip