Robustness Gym: Unifying the NLP Evaluation Landscape

Karan Goel, Nazneen Fatema Rajani, Jesse Vig, Zachary Taschdjian, Mohit Bansal, Christopher Ré


Abstract
Despite impressive performance on standard benchmarks, natural language processing (NLP) models are often brittle when deployed in real-world systems. In this work, we identify challenges with evaluating NLP systems and propose a solution in the form of Robustness Gym (RG), a simple and extensible evaluation toolkit that unifies 4 standard evaluation paradigms: subpopulations, transformations, evaluation sets, and adversarial attacks. By providing a common platform for evaluation, RG enables practitioners to compare results from disparate evaluation paradigms with a single click, and to easily develop and share novel evaluation methods using a built-in set of abstractions. RG is under active development and we welcome feedback & contributions from the community.
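The abstract describes unifying four evaluation paradigms under a common set of abstractions. As a purely conceptual sketch (the names and signatures below are hypothetical and do not reflect Robustness Gym's actual API), the idea can be illustrated as each paradigm being a different way of producing a "slice" of examples, so that a single evaluation loop compares results across all of them:

```python
# Hypothetical illustration only -- not Robustness Gym's real API.
# Each paradigm reduces to a function that yields a slice (a list of
# examples), so one evaluation loop covers all four side by side.
from typing import Callable, Dict, List

Example = Dict[str, str]          # e.g. {"text": ..., "label": ...}
Slice = List[Example]             # any collection of examples to evaluate on
Model = Callable[[str], str]      # maps input text to a predicted label


def subpopulation(data: Slice, predicate: Callable[[Example], bool]) -> Slice:
    """Paradigm 1: filter the dataset to a subpopulation of interest."""
    return [ex for ex in data if predicate(ex)]


def transformation(data: Slice, transform: Callable[[Example], Example]) -> Slice:
    """Paradigm 2: perturb every example (paraphrases, typos, etc.)."""
    return [transform(ex) for ex in data]


def evaluation_set(loader: Callable[[], Slice]) -> Slice:
    """Paradigm 3: load a hand-curated challenge set as-is."""
    return loader()


def adversarial_attack(data: Slice, model: Model,
                       attack: Callable[[Example, Model], Example]) -> Slice:
    """Paradigm 4: perturb each example in a model-dependent way."""
    return [attack(ex, model) for ex in data]


def accuracy(model: Model, sl: Slice) -> float:
    """Shared metric applied identically to every slice."""
    if not sl:
        return 0.0
    return sum(model(ex["text"]) == ex["label"] for ex in sl) / len(sl)


def evaluate(model: Model, slices: Dict[str, Slice]) -> Dict[str, float]:
    """One loop reports comparable results for every paradigm."""
    return {name: accuracy(model, sl) for name, sl in slices.items()}


if __name__ == "__main__":
    data = [{"text": "a tiny example", "label": "pos"},
            {"text": "a very long example sentence indeed", "label": "neg"}]
    model: Model = lambda text: "pos" if len(text) < 20 else "neg"
    slices = {
        "short-inputs": subpopulation(data, lambda ex: len(ex["text"]) < 20),
        "uppercased": transformation(data, lambda ex: {**ex, "text": ex["text"].upper()}),
    }
    print(evaluate(model, slices))
```

Because every slice is evaluated by the same loop, results from disparate paradigms become directly comparable, which is the core design point the abstract makes.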
Anthology ID: 2021.naacl-demos.6
Volume: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Demonstrations
Month: June
Year: 2021
Address: Online
Editors: Avi Sil, Xi Victoria Lin
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 42–55
URL: https://aclanthology.org/2021.naacl-demos.6
DOI: 10.18653/v1/2021.naacl-demos.6
Cite (ACL): Karan Goel, Nazneen Fatema Rajani, Jesse Vig, Zachary Taschdjian, Mohit Bansal, and Christopher Ré. 2021. Robustness Gym: Unifying the NLP Evaluation Landscape. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Demonstrations, pages 42–55, Online. Association for Computational Linguistics.
Cite (Informal): Robustness Gym: Unifying the NLP Evaluation Landscape (Goel et al., NAACL 2021)
PDF: https://aclanthology.org/2021.naacl-demos.6.pdf
Code: robustness-gym/robustness-gym (+ additional community code)
Data: ANLI, MultiNLI, SNLI