Evaluating Pretrained Transformer Models for Entity Linking in Task-Oriented Dialog

Sai Muralidhar Jayanthi, Varsha Embar, Karthik Raghunathan


Abstract
The wide applicability of pretrained transformer models (PTMs) to natural language tasks is well demonstrated, but their ability to comprehend short phrases of text is less explored. To this end, we evaluate different PTMs through the lens of unsupervised entity linking in task-oriented dialog across five characteristics: syntactic, semantic, short-forms, numeric, and phonetic. Our results demonstrate that several of the PTMs produce sub-par results when compared to traditional techniques, albeit competitive with other neural baselines. We find that some of their shortcomings can be addressed by using PTMs fine-tuned for text-similarity tasks, which show an improved ability to comprehend semantic and syntactic correspondences, as well as some improvement on short-form, numeric, and phonetic variations in entity mentions. We perform qualitative analysis to understand nuances in their predictions and discuss scope for further improvements.
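The unsupervised setup described in the abstract can be illustrated with a minimal Python sketch (not the authors' implementation; the model name and the toy entity list are assumptions for illustration): a PTM fine-tuned for text similarity embeds a dialog mention and the candidate knowledge-base entities, and the entity with the highest cosine similarity is linked.

from sentence_transformers import SentenceTransformer, util

# Assumed similarity-tuned PTM; any sentence-similarity model would do.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Hypothetical knowledge base of canonical entity names.
entities = ["The Grand Budapest Hotel", "Hotel California", "Budapest Marriott"]
mention = "grand budapest"  # entity mention from a dialog turn

# Embed the mention and all candidates, then rank candidates by cosine similarity.
mention_emb = model.encode(mention, convert_to_tensor=True)
entity_embs = model.encode(entities, convert_to_tensor=True)
scores = util.cos_sim(mention_emb, entity_embs)[0]
best = int(scores.argmax())
print(entities[best], float(scores[best]))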
Anthology ID:
2021.icon-main.65
Volume:
Proceedings of the 18th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2021
Address:
National Institute of Technology Silchar, Silchar, India
Editors:
Sivaji Bandyopadhyay, Sobha Lalitha Devi, Pushpak Bhattacharyya
Venue:
ICON
Publisher:
NLP Association of India (NLPAI)
Pages:
537–543
URL:
https://aclanthology.org/2021.icon-main.65
Cite (ACL):
Sai Muralidhar Jayanthi, Varsha Embar, and Karthik Raghunathan. 2021. Evaluating Pretrained Transformer Models for Entity Linking in Task-Oriented Dialog. In Proceedings of the 18th International Conference on Natural Language Processing (ICON), pages 537–543, National Institute of Technology Silchar, Silchar, India. NLP Association of India (NLPAI).
Cite (Informal):
Evaluating Pretrained Transformer Models for Entity Linking in Task-Oriented Dialog (Jayanthi et al., ICON 2021)
PDF:
https://aclanthology.org/2021.icon-main.65.pdf
Optional supplementary material:
 2021.icon-main.65.OptionalSupplementaryMaterial.pdf
Code
 murali1996/el_tod
Data
Acronym Identification
MKQA