Building a Bridge: A Method for Image-Text Sarcasm Detection Without Pretraining on Image-Text Data

Xinyu Wang, Xiaowen Sun, Tan Yang, Hongbo Wang


Abstract
Sarcasm detection in social media posts that combine text and images is becoming more challenging. Previous work on image-text sarcasm detection mainly fused summaries of the text and the image: separate sub-models read the text and the image to produce summaries, which are then fused. Recently, multi-modal models based on the BERT architecture, such as ViLBERT, have been proposed; however, they can only be pretrained on image-text data. In this paper, we propose an image-text model for sarcasm detection that uses pretrained BERT and ResNet without any further pretraining. BERT and ResNet have been pretrained on much larger text and image corpora than the available image-text data. We connect the vector spaces of BERT and ResNet to exploit this data, and we use the pretrained Multi-Head Attention of BERT to model the text and the image. In addition, we propose a 2D-Intra-Attention to extract relationships between words and images. In experiments, our model outperforms the state-of-the-art model.
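
As a rough illustration of the bridging idea described in the abstract, the sketch below projects ResNet region features into BERT's embedding space and runs the concatenated sequence through BERT's pretrained Transformer encoder. The linear-projection "bridge", the feature shapes, and the model checkpoints are illustrative assumptions for this sketch; they are not the paper's exact architecture (in particular, the 2D-Intra-Attention module is not shown).

```python
import torch
import torch.nn as nn
from transformers import BertModel
from torchvision.models import resnet50


class ImageTextBridge(nn.Module):
    """Minimal sketch: map ResNet visual features into BERT's vector space
    so BERT's pretrained Multi-Head Attention can attend over both
    modalities. Design details here are assumptions, not the authors'
    published model."""

    def __init__(self, hidden_size=768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        cnn = resnet50(weights="IMAGENET1K_V1")
        # Keep ResNet up to (but not including) global pooling:
        # a 224x224 image yields a 7x7 grid of 2048-d region features.
        self.cnn = nn.Sequential(*list(cnn.children())[:-2])
        # The "bridge": project 2048-d visual features into BERT's space.
        self.bridge = nn.Linear(2048, hidden_size)

    def forward(self, input_ids, attention_mask, images):
        # Token embeddings from BERT's embedding layer: (B, L, 768)
        text_emb = self.bert.embeddings(input_ids=input_ids)
        # Region features: (B, 2048, 7, 7) -> (B, 49, 2048) -> (B, 49, 768)
        feats = self.cnn(images).flatten(2).transpose(1, 2)
        img_emb = self.bridge(feats)
        # Concatenate modalities and feed them through BERT's pretrained
        # Transformer encoder (Multi-Head Attention over text + image).
        fused = torch.cat([text_emb, img_emb], dim=1)
        img_mask = torch.ones(img_emb.size()[:2], device=img_emb.device)
        mask = torch.cat([attention_mask.float(), img_mask], dim=1)
        # Additive attention mask in the shape BERT's encoder expects.
        ext_mask = (1.0 - mask[:, None, None, :]) * -10000.0
        out = self.bert.encoder(fused, attention_mask=ext_mask)
        # Final hidden state at the [CLS] position as the fused summary.
        return out.last_hidden_state[:, 0]
```

In a full model, a classification head (e.g., a linear layer over the returned [CLS] vector) would produce the binary sarcasm prediction; the image tokens here carry no position embeddings, which is one of many simplifications relative to the paper.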
Anthology ID:
2020.nlpbt-1.3
Volume:
Proceedings of the First International Workshop on Natural Language Processing Beyond Text
Month:
November
Year:
2020
Address:
Online
Editors:
Giuseppe Castellucci, Simone Filice, Soujanya Poria, Erik Cambria, Lucia Specia
Venue:
nlpbt
Publisher:
Association for Computational Linguistics
Pages:
19–29
URL:
https://aclanthology.org/2020.nlpbt-1.3
DOI:
10.18653/v1/2020.nlpbt-1.3
Cite (ACL):
Xinyu Wang, Xiaowen Sun, Tan Yang, and Hongbo Wang. 2020. Building a Bridge: A Method for Image-Text Sarcasm Detection Without Pretraining on Image-Text Data. In Proceedings of the First International Workshop on Natural Language Processing Beyond Text, pages 19–29, Online. Association for Computational Linguistics.
Cite (Informal):
Building a Bridge: A Method for Image-Text Sarcasm Detection Without Pretraining on Image-Text Data (Wang et al., nlpbt 2020)
PDF:
https://aclanthology.org/2020.nlpbt-1.3.pdf