Jacob K. Johnson
2024
Online Learning of ITSL Grammars
Jacob K. Johnson | Aniello De Santo
Proceedings of the Society for Computation in Linguistics 2024
2023
How Much Consistency Is Your Accuracy Worth?
Jacob K. Johnson | Ana Marasović
Proceedings of the 6th BlackboxNLP Workshop: Analyzing and Interpreting Neural Networks for NLP
Contrast set consistency is a robustness measurement that evaluates the rate at which a model correctly responds to all instances in a bundle of minimally different examples relying on the same knowledge. To draw additional insights, we propose to complement consistency with relative consistency—the probability that an equally accurate model would surpass the consistency of the proposed model, given a distribution over possible consistencies. Models with 100% relative consistency have reached a consistency peak for their accuracy. We reflect on prior work that reports consistency in contrast sets and observe that relative consistency can alter the assessment of a model’s consistency compared to another. We anticipate that our proposed measurement and insights will influence future studies aiming to promote consistent behavior in models.
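For intuition, here is a minimal sketch of how contrast set consistency, and a Monte Carlo estimate of the relative consistency described above, could be computed. The representation of contrast sets as lists of per-instance correctness flags, and the null distribution used here (the same number of correct predictions reassigned uniformly at random across instances), are illustrative assumptions rather than the paper's actual procedure.

```python
import random


def consistency(bundles):
    """Contrast set consistency: the rate at which every instance in a
    bundle of minimally different examples is answered correctly.
    `bundles` is a list of lists of booleans (per-instance correctness)."""
    return sum(all(bundle) for bundle in bundles) / len(bundles)


def relative_consistency(bundles, n_samples=10_000, seed=0):
    """Monte Carlo estimate of the probability that an equally accurate
    model surpasses the observed consistency, under an assumed null in
    which the same number of correct predictions is reassigned uniformly
    at random across instances (same accuracy, different placement)."""
    rng = random.Random(seed)
    flat = [c for bundle in bundles for c in bundle]
    sizes = [len(bundle) for bundle in bundles]
    observed = consistency(bundles)

    surpass = 0
    for _ in range(n_samples):
        rng.shuffle(flat)  # hypothetical equally accurate model
        resampled, i = [], 0
        for s in sizes:
            resampled.append(flat[i:i + s])
            i += s
        if consistency(resampled) > observed:
            surpass += 1
    return surpass / n_samples


# Example: three contrast sets of two instances each.
bundles = [[True, True], [True, False], [False, False]]
print(consistency(bundles))           # 1/3 of bundles fully correct
print(relative_consistency(bundles))  # estimate under the assumed null
```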