Random Text Perturbations Work, but not Always

Zhengxiang Wang


Anthology ID:
2022.eval4nlp-1.6
Volume:
Proceedings of the 3rd Workshop on Evaluation and Comparison of NLP Systems
Month:
November
Year:
2022
Address:
Online
Editors:
Daniel Deutsch, Can Udomcharoenchaikit, Juri Opitz, Yang Gao, Marina Fomicheva, Steffen Eger
Venue:
Eval4NLP
Publisher:
Association for Computational Linguistics
Pages:
51–57
URL:
https://aclanthology.org/2022.eval4nlp-1.6
DOI:
10.18653/v1/2022.eval4nlp-1.6
Cite (ACL):
Zhengxiang Wang. 2022. Random Text Perturbations Work, but not Always. In Proceedings of the 3rd Workshop on Evaluation and Comparison of NLP Systems, pages 51–57, Online. Association for Computational Linguistics.
Cite (Informal):
Random Text Perturbations Work, but not Always (Wang, Eval4NLP 2022)
PDF:
https://aclanthology.org/2022.eval4nlp-1.6.pdf
Supplementary material:
2022.eval4nlp-1.6.SupplementaryMaterial.zip