What do Humor Classifiers Learn? An Attempt to Explain Humor Recognition Models

Marcio Lima Inácio, Gabriela Wick-pedro, Hugo Goncalo Oliveira


Abstract
To build computational systems capable of handling complex and general linguistic phenomena, it is essential to understand figurative language, of which verbal humor is an instance. This paper reports state-of-the-art results for Humor Recognition in Portuguese, specifically an F1-score of 99.64% with a BERT-based classifier. However, given the surprisingly high performance on such a challenging task, we further analyzed what the classifiers actually learned. Our main conclusions are that classifiers based on content features achieve the best performance, but rely mostly on stylistic aspects of the text that are not necessarily related to humor, such as punctuation and question words. For humor-related features, on the other hand, we identified some important aspects, such as the presence of named entities, ambiguity, and incongruity.
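The abstract's finding that content-based classifiers latch onto stylistic shortcuts can be illustrated with a toy sketch. The data, cue lists, and rule below are hypothetical illustrations, not taken from the paper:

```python
# Toy illustration (not from the paper): a "classifier" can separate
# humorous from non-humorous texts using only shallow stylistic cues,
# such as punctuation and question words, without modeling humor at all.

QUESTION_WORDS = {"what", "why", "how", "who"}  # hypothetical cue list

def stylistic_features(text):
    """Count shallow cues: question/exclamation marks and question words."""
    tokens = text.lower().replace("?", " ?").replace("!", " !").split()
    return {
        "punct": sum(t in {"?", "!"} for t in tokens),
        "qwords": sum(t in QUESTION_WORDS for t in tokens),
    }

def shortcut_classifier(text):
    """Label as 'humor' whenever any stylistic cue fires."""
    feats = stylistic_features(text)
    return "humor" if feats["punct"] + feats["qwords"] > 0 else "non-humor"

# Toy corpus: jokes often open with a question; news sentences do not.
print(shortcut_classifier("Why did the chicken cross the road?"))   # humor
print(shortcut_classifier("The minister announced new measures."))  # non-humor
```

Such a rule can look accurate on corpora where jokes and non-jokes differ in surface style, which is precisely why the paper probes whether high F1-scores reflect humor understanding or dataset artifacts.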
Anthology ID:
2023.latechclfl-1.10
Volume:
Proceedings of the 7th Joint SIGHUM Workshop on Computational Linguistics for Cultural Heritage, Social Sciences, Humanities and Literature
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Stefania Degaetano-Ortlieb, Anna Kazantseva, Nils Reiter, Stan Szpakowicz
Venue:
LaTeCHCLfL
Publisher:
Association for Computational Linguistics
Pages:
88–98
URL:
https://aclanthology.org/2023.latechclfl-1.10
DOI:
10.18653/v1/2023.latechclfl-1.10
Cite (ACL):
Marcio Lima Inácio, Gabriela Wick-pedro, and Hugo Goncalo Oliveira. 2023. What do Humor Classifiers Learn? An Attempt to Explain Humor Recognition Models. In Proceedings of the 7th Joint SIGHUM Workshop on Computational Linguistics for Cultural Heritage, Social Sciences, Humanities and Literature, pages 88–98, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
What do Humor Classifiers Learn? An Attempt to Explain Humor Recognition Models (Lima Inácio et al., LaTeCHCLfL 2023)
PDF:
https://aclanthology.org/2023.latechclfl-1.10.pdf
Video:
https://aclanthology.org/2023.latechclfl-1.10.mp4