Sample Efficient Approaches for Idiomaticity Detection

Dylan Phelps, Xuan-Rui Fan, Edward Gow-Smith, Harish Tayyar Madabushi, Carolina Scarton, Aline Villavicencio


Abstract
Deep neural models, in particular Transformer-based pre-trained language models, require large amounts of training data. This data requirement tends to cause problems when dealing with idiomatic multiword expressions (MWEs), which are inherently infrequent in natural text. This work therefore explores sample-efficient methods of idiomaticity detection. In particular, we study the impact of Pattern-Exploiting Training (PET), a few-shot classification method, and BERTRAM, an efficient method of creating contextual embeddings, on the task of idiomaticity detection. In addition, to further explore generalisability, we focus on the identification of MWEs not present in the training data. Our experiments show that while these methods improve performance on English, they are much less effective on Portuguese and Galician, leading to overall performance roughly on par with vanilla mBERT. Regardless, we believe sample-efficient methods for both identifying and representing potentially idiomatic MWEs are very encouraging and hold significant potential for future exploration.
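For illustration, the hedged Python sketch below shows the pattern-verbalizer idea behind PET: classification is recast as a cloze task, where a pattern wraps the input and each label is mapped to a single vocabulary token. The pattern, the verbalizers ("yes"/"no"), and the example sentence are assumptions for illustration, not the paper's actual configuration; full PET additionally fine-tunes on a few labelled examples per pattern and distils the results, which is omitted here. English BERT is used so that the verbalizers are single vocabulary tokens (the paper works with mBERT).

# Hypothetical PET-style cloze pattern for idiomaticity detection.
# Pattern, verbalizers, and example sentence are illustrative assumptions.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

sentence = "He is a big fish in the local council."
mwe = "big fish"
# The pattern embeds the input; verbalizers map each label to one token.
pattern = f'{sentence} Is "{mwe}" used idiomatically here? {tokenizer.mask_token}.'
verbalizers = {"idiomatic": "yes", "literal": "no"}

inputs = tokenizer(pattern, return_tensors="pt")
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
with torch.no_grad():
    logits = model(**inputs).logits[0, mask_pos]

# Score each label by the logit of its verbalizer token at the mask position.
scores = {
    label: logits[tokenizer.convert_tokens_to_ids(token)].item()
    for label, token in verbalizers.items()
}
print(max(scores, key=scores.get), scores)

BERTRAM, in contrast, learns a dedicated single-token embedding for a rare word or MWE from its observed contexts. The sketch below shows only the vocabulary-injection step: a hypothetical [BIG_FISH] token is added and its embedding initialised with the mean of the component words' embeddings, as a stand-in for the vector that a trained BERTRAM model would actually produce.

# Hedged sketch: injecting a single-token MWE representation into mBERT.
# The mean-of-components initialisation is a placeholder assumption,
# not the paper's method; BERTRAM learns this vector from contexts.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

mwe_token = "[BIG_FISH]"  # hypothetical single token standing in for "big fish"
tokenizer.add_tokens([mwe_token])
model.resize_token_embeddings(len(tokenizer))

emb = model.get_input_embeddings().weight
component_ids = tokenizer("big fish", add_special_tokens=False).input_ids
with torch.no_grad():
    # Overwrite the new token's randomly initialised row.
    emb[tokenizer.convert_tokens_to_ids(mwe_token)] = emb[component_ids].mean(dim=0)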
Anthology ID:
2022.mwe-1.15
Volume:
Proceedings of the 18th Workshop on Multiword Expressions @LREC2022
Month:
June
Year:
2022
Address:
Marseille, France
Venue:
MWE
SIG:
SIGLEX
Publisher:
European Language Resources Association
Pages:
105–111
URL:
https://aclanthology.org/2022.mwe-1.15
Cite (ACL):
Dylan Phelps, Xuan-Rui Fan, Edward Gow-Smith, Harish Tayyar Madabushi, Carolina Scarton, and Aline Villavicencio. 2022. Sample Efficient Approaches for Idiomaticity Detection. In Proceedings of the 18th Workshop on Multiword Expressions @LREC2022, pages 105–111, Marseille, France. European Language Resources Association.
Cite (Informal):
Sample Efficient Approaches for Idiomaticity Detection (Phelps et al., MWE 2022)
PDF:
https://aclanthology.org/2022.mwe-1.15.pdf