The Devil is in the Details: Evaluating Limitations of Transformer-based Methods for Granular Tasks

Brihi Joshi, Neil Shah, Francesco Barbieri, Leonardo Neves


Abstract
Contextual embeddings derived from transformer-based neural language models have shown state-of-the-art performance for various tasks such as question answering, sentiment analysis, and textual similarity in recent years. Extensive work shows how accurately such models can represent abstract, semantic information present in text. In this expository work, we explore a tangential direction and analyze such models' performance on tasks that require a more granular level of representation. We focus on the problem of textual similarity from two perspectives: matching documents on a granular level (requiring embeddings to capture fine-grained attributes in the text), and on an abstract level (requiring embeddings to capture overall textual semantics). We empirically demonstrate, across two datasets from different domains, that despite high performance in abstract document matching as expected, contextual embeddings are consistently (and at times, vastly) outperformed by simple baselines like TF-IDF on more granular tasks. We then propose a simple but effective method to incorporate TF-IDF into models that use contextual embeddings, achieving relative improvements of up to 36% on granular tasks.
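The abstract describes the combination of TF-IDF with contextual embeddings only at a high level. As a rough illustration, below is a minimal sketch in Python that interpolates TF-IDF cosine similarity with similarity over mean-pooled BERT embeddings. The interpolation weight alpha, the mean-pooling encoder, and the combined_similarity helper are illustrative assumptions, not the authors' exact formulation; their actual implementation is in the repository linked under "Code" below.

# A hypothetical sketch of combining TF-IDF with contextual embeddings via
# score interpolation. The weight `alpha` and the mean-pooled BERT encoder
# are assumptions for illustration, not the paper's exact method.
import numpy as np
import torch
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from transformers import AutoModel, AutoTokenizer

def embed(texts, model_name="bert-base-uncased"):
    """Mean-pooled contextual embeddings for a list of documents."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state        # (batch, seq, dim)
    mask = enc["attention_mask"].unsqueeze(-1)         # ignore padding tokens
    return ((hidden * mask).sum(dim=1) / mask.sum(dim=1)).numpy()

def combined_similarity(docs, alpha=0.5):
    """alpha * contextual similarity + (1 - alpha) * TF-IDF similarity."""
    sim_tfidf = cosine_similarity(TfidfVectorizer().fit_transform(docs))
    sim_ctx = cosine_similarity(embed(docs))
    return alpha * sim_ctx + (1 - alpha) * sim_tfidf

docs = [
    "The quarterly report lists revenue of 4.2 million dollars.",
    "Quarterly revenue came in at 4.2 million, per the report.",
    "The annual report lists revenue of 9.8 million dollars.",
]
print(np.round(combined_similarity(docs), 3))

Score-level interpolation keeps the sparse lexical signal and the dense semantic signal decoupled, so alpha can be tuned on a validation set to trade off granular (attribute-matching) against abstract (semantic) similarity.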
Anthology ID: 2020.coling-main.326
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 3652–3659
URL: https://aclanthology.org/2020.coling-main.326
DOI: 10.18653/v1/2020.coling-main.326
Cite (ACL): Brihi Joshi, Neil Shah, Francesco Barbieri, and Leonardo Neves. 2020. The Devil is in the Details: Evaluating Limitations of Transformer-based Methods for Granular Tasks. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3652–3659, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): The Devil is in the Details: Evaluating Limitations of Transformer-based Methods for Granular Tasks (Joshi et al., COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.326.pdf
Code: brihijoshi/granular-similarity-COLING-2020
Data: GLUE