Evaluating Research Novelty Detection: Counterfactual Approaches

Reinald Kim Amplayo, Seung-won Hwang, Min Song


Abstract
In this paper, we explore strategies to evaluate models for the task of research paper novelty detection: given all papers released at a given date, which of them discuss new ideas and influence future research? We find that novelty is not a singular concept and thus inherently lacks ground-truth annotations with cross-annotator agreement, which is a major obstacle in evaluating these models. The test-of-time award is the closest such annotation, but it can only be made retrospectively and is extremely scarce. We thus propose to compare and evaluate models using counterfactual simulations. First, we ask whether models can differentiate papers at time t from counterfactual papers from a future time t+d. Second, we ask whether models can predict the test-of-time award at t+d. These proxies can be agreed upon by human annotators and easily augmented with correlated signals, enabling evaluation through four tasks: classification, ranking, correlation, and feature selection. We show that these proxy evaluation methods complement each other in terms of error handling, coverage, interpretability, and scope, and thus together contribute to observing the relative strengths of existing models.
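As an illustration only (not the authors' code), the first counterfactual proxy can be read as a binary classification check: can a novelty model's features separate papers published at time t from "counterfactual" papers drawn from a future time t+d? The sketch below assumes each paper has already been reduced to a feature vector by some novelty model; the synthetic data and feature dimension are purely hypothetical.

```python
# Minimal sketch of the counterfactual classification proxy, assuming
# papers are represented as feature vectors produced by a novelty model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical stand-ins for model-produced paper features.
papers_at_t = rng.normal(loc=0.0, scale=1.0, size=(500, 16))          # papers at time t
papers_at_t_plus_d = rng.normal(loc=0.3, scale=1.0, size=(500, 16))   # future papers at t+d

X = np.vstack([papers_at_t, papers_at_t_plus_d])
y = np.concatenate([np.zeros(500), np.ones(500)])  # label 1 = future (t+d) paper

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"Counterfactual classification AUC: {auc:.3f}")
```

A higher AUC under this setup would suggest the model's features carry a signal that distinguishes current from future papers, which is the intuition behind the paper's classification-based proxy; the ranking, correlation, and feature-selection tasks build on the same labels.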
Anthology ID:
D19-5315
Volume:
Proceedings of the Thirteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-13)
Month:
November
Year:
2019
Address:
Hong Kong
Editors:
Dmitry Ustalov, Swapna Somasundaran, Peter Jansen, Goran Glavaš, Martin Riedl, Mihai Surdeanu, Michalis Vazirgiannis
Venue:
TextGraphs
Publisher:
Association for Computational Linguistics
Pages:
124–133
URL:
https://aclanthology.org/D19-5315
DOI:
10.18653/v1/D19-5315
Cite (ACL):
Reinald Kim Amplayo, Seung-won Hwang, and Min Song. 2019. Evaluating Research Novelty Detection: Counterfactual Approaches. In Proceedings of the Thirteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-13), pages 124–133, Hong Kong. Association for Computational Linguistics.
Cite (Informal):
Evaluating Research Novelty Detection: Counterfactual Approaches (Amplayo et al., TextGraphs 2019)
PDF:
https://aclanthology.org/D19-5315.pdf