A Multilingual Evaluation of NER Robustness to Adversarial Inputs

Akshay Srinivasan, Sowmya Vajjala


Abstract
Adversarial evaluations of language models typically focus on English alone. In this paper, we performed a multilingual evaluation of Named Entity Recognition (NER) in terms of its robustness to small perturbations in the input. Our results showed that the NER models we explored across three languages (English, German, and Hindi) are not very robust to such changes, as indicated by fluctuations in the overall F1 score as well as in a more fine-grained evaluation. With that knowledge, we further explored whether it is possible to improve existing NER models by using part of the generated adversarial data sets either as augmented training data to train a new NER model or as fine-tuning data to adapt an existing NER model. Our results showed that both approaches improve performance on the original as well as the adversarial test sets. While there is no significant difference between the two approaches for English, re-training is significantly better than fine-tuning for German and Hindi.
Anthology ID:
2023.repl4nlp-1.4
Volume:
Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP 2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Burcu Can, Maximilian Mozes, Samuel Cahyawijaya, Naomi Saphra, Nora Kassner, Shauli Ravfogel, Abhilasha Ravichander, Chen Zhao, Isabelle Augenstein, Anna Rogers, Kyunghyun Cho, Edward Grefenstette, Lena Voita
Venue:
RepL4NLP
Publisher:
Association for Computational Linguistics
Pages:
40–53
URL:
https://aclanthology.org/2023.repl4nlp-1.4
DOI:
10.18653/v1/2023.repl4nlp-1.4
Cite (ACL):
Akshay Srinivasan and Sowmya Vajjala. 2023. A Multilingual Evaluation of NER Robustness to Adversarial Inputs. In Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP 2023), pages 40–53, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
A Multilingual Evaluation of NER Robustness to Adversarial Inputs (Srinivasan & Vajjala, RepL4NLP 2023)
PDF:
https://aclanthology.org/2023.repl4nlp-1.4.pdf
Video:
https://aclanthology.org/2023.repl4nlp-1.4.mp4