TEDB System Description to a Shared Task on Euphemism Detection 2022

Peratham Wiriyathammabhum


Abstract
In this report, we describe our Transformers for euphemism detection baseline (TEDB) submissions to a shared task on euphemism detection 2022. We cast euphemism detection as a text classification task and considered Transformer-based models, which are the current state of the art for text classification. We explored different training schemes, pretrained models, and model architectures. Our best result of 0.816 F1-score (0.818 precision and 0.814 recall) consists of a euphemism-detection-finetuned TweetEval/TimeLMs-pretrained RoBERTa model as a feature-extractor frontend with a KimCNN classifier backend, trained end-to-end using a cosine annealing scheduler. We observed that models pretrained on sentiment analysis and offensiveness detection correlate with higher F1-scores, while pretraining on other tasks, such as sarcasm detection, yields lower F1-scores. Also, adding more word vector channels did not improve performance in our experiments.
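The abstract mentions training with a cosine annealing scheduler. As a minimal sketch of that schedule (the learning-rate bounds `lr_max` and `lr_min` here are illustrative assumptions, not values reported by the paper), the rate decays from a maximum to a minimum along a half-cosine over the training run:

```python
import math

def cosine_annealing_lr(step, total_steps, lr_max=2e-5, lr_min=0.0):
    """Cosine annealing schedule: decay the learning rate from lr_max
    at step 0 to lr_min at total_steps along a half-cosine curve.
    lr_max/lr_min defaults are illustrative, not from the paper."""
    progress = step / total_steps  # fraction of training completed, in [0, 1]
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * progress))

# The rate starts at lr_max, passes the midpoint halfway through,
# and reaches lr_min at the end of training.
```

In practice this corresponds to schedulers such as PyTorch's `CosineAnnealingLR`; the function above only illustrates the shape of the decay.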
Anthology ID:
2022.flp-1.1
Volume:
Proceedings of the 3rd Workshop on Figurative Language Processing (FLP)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Debanjan Ghosh, Beata Beigman Klebanov, Smaranda Muresan, Anna Feldman, Soujanya Poria, Tuhin Chakrabarty
Venue:
Fig-Lang
Publisher:
Association for Computational Linguistics
Pages:
1–7
URL:
https://aclanthology.org/2022.flp-1.1
DOI:
10.18653/v1/2022.flp-1.1
Bibkey:
Cite (ACL):
Peratham Wiriyathammabhum. 2022. TEDB System Description to a Shared Task on Euphemism Detection 2022. In Proceedings of the 3rd Workshop on Figurative Language Processing (FLP), pages 1–7, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
TEDB System Description to a Shared Task on Euphemism Detection 2022 (Wiriyathammabhum, Fig-Lang 2022)
PDF:
https://aclanthology.org/2022.flp-1.1.pdf
Video:
https://aclanthology.org/2022.flp-1.1.mp4