Multilinguals at SemEval-2022 Task 11: Transformer Based Architecture for Complex NER

Amit Pandey, Swayatta Daw, Vikram Pudi


Abstract
We investigate the task of complex NER for the English language. The task is non-trivial due to the semantic ambiguity of the entities' textual structure and their rarity in the prevalent literature. Using pre-trained language models such as BERT, we obtain competitive performance on this task. We qualitatively analyze the performance of multiple architectures for this task. All our models outperform the baseline by a significant margin. Our best-performing model exceeds the baseline F1-score by over 9%.
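To make the abstract's approach concrete, below is a minimal sketch of a BERT-based token-classification setup for NER using the Hugging Face transformers library. This is not the authors' exact architecture; the backbone name, the label set, and the example sentence are illustrative assumptions, and the classification head is randomly initialized until fine-tuned on the task data.

```python
# Minimal sketch (assumptions, not the paper's exact system): a BERT encoder with a
# token-classification head for NER, using the Hugging Face transformers library.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed, MultiCoNER-style label set; the actual tag inventory comes from the task data.
labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-CORP", "I-CORP"]
model_name = "bert-base-cased"  # assumed backbone

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(labels)
)

sentence = "steve jobs co-founded apple in cupertino"
inputs = tokenizer(sentence, return_tensors="pt")

# Forward pass; logits have shape (batch, sequence_length, num_labels).
with torch.no_grad():
    logits = model(**inputs).logits

# Pick the highest-scoring tag per subword token (meaningful only after fine-tuning).
predictions = logits.argmax(dim=-1).squeeze(0)
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"].squeeze(0))
for token, pred in zip(tokens, predictions):
    print(f"{token}\t{labels[pred]}")
```

In practice such a model would be fine-tuned on the MultiCoNER training split with a standard cross-entropy loss over the tag set before the predictions are evaluated.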
Anthology ID:
2022.semeval-1.224
Volume:
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Guy Emerson, Natalie Schluter, Gabriel Stanovsky, Ritesh Kumar, Alexis Palmer, Nathan Schneider, Siddharth Singh, Shyam Ratan
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
1623–1629
URL:
https://aclanthology.org/2022.semeval-1.224
DOI:
10.18653/v1/2022.semeval-1.224
Cite (ACL):
Amit Pandey, Swayatta Daw, and Vikram Pudi. 2022. Multilinguals at SemEval-2022 Task 11: Transformer Based Architecture for Complex NER. In Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022), pages 1623–1629, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Multilinguals at SemEval-2022 Task 11: Transformer Based Architecture for Complex NER (Pandey et al., SemEval 2022)
PDF:
https://aclanthology.org/2022.semeval-1.224.pdf
Code:
amitpandey-research/complex_ner
Data:
MultiCoNER