When Reviewers Lock Horns: Finding Disagreements in Scientific Peer Reviews

Sandeep Kumar, Tirthankar Ghosal, Asif Ekbal


Abstract
To date, the efficacy of the scientific publishing enterprise fundamentally rests on the strength of the peer review process. The journal editor or the conference chair primarily relies on the expert reviewers’ assessments, identifies points of agreement and disagreement, and tries to reach a consensus to make a fair and informed decision on whether to accept or reject a paper. However, with the escalating number of submissions requiring review, especially in top-tier Artificial Intelligence (AI) conferences, the editor/chair, among many other duties, invests significant, sometimes stressful effort to mitigate reviewer disagreements. In this work, we introduce a novel task of automatically identifying contradictions among reviewers on a given article. To this end, we introduce ContraSciView, a comprehensive review-pair contradiction dataset on around 8.5k papers (with around 28k review pairs containing nearly 50k review-pair comments) from the open-review-based ICLR and NeurIPS conferences. We further propose a baseline model that detects contradictory statements from the review pairs. To the best of our knowledge, we make the first attempt to identify disagreements among peer reviewers automatically. We make our dataset and code public for further investigation.
Anthology ID:
2023.emnlp-main.1038
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
16693–16704
URL:
https://aclanthology.org/2023.emnlp-main.1038
DOI:
10.18653/v1/2023.emnlp-main.1038
Cite (ACL):
Sandeep Kumar, Tirthankar Ghosal, and Asif Ekbal. 2023. When Reviewers Lock Horns: Finding Disagreements in Scientific Peer Reviews. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 16693–16704, Singapore. Association for Computational Linguistics.
Cite (Informal):
When Reviewers Lock Horns: Finding Disagreements in Scientific Peer Reviews (Kumar et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.1038.pdf