Improving Neural Metaphor Detection with Visual Datasets

Gitit Kehat, James Pustejovsky


Abstract
We present new results on metaphor detection using text from visual datasets. Using a straightforward technique for sampling text from vision-language datasets, we create a data structure we term a visibility word embedding. We then combine these embeddings in a relatively simple BiLSTM module augmented with contextualized word representations (ELMo), and show improvement over previous state-of-the-art approaches that use more complex neural network architectures and richer linguistic features, on the task of verb classification.
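The abstract's core idea, word representations derived from text sampled out of vision-language datasets, can be illustrated with a toy sketch. The snippet below builds simple co-occurrence count vectors from a handful of made-up caption sentences; the captions, the window size, and the count-based representation are all illustrative assumptions, not the paper's actual embedding-training procedure, which operates on real vision-language corpora.

```python
from collections import defaultdict

# Toy captions standing in for text sampled from a vision-language
# dataset (illustrative data, not from the paper).
captions = [
    "a dog runs across the grass",
    "a dog catches a ball",
    "grass covers the hill",
]

def cooccurrence_vectors(sentences, window=2):
    """Represent each word by counts of the other words that appear
    within `window` positions of it across all sentences."""
    vocab = sorted({w for s in sentences for w in s.split()})
    index = {w: i for i, w in enumerate(vocab)}
    vecs = defaultdict(lambda: [0] * len(vocab))
    for s in sentences:
        words = s.split()
        for i, w in enumerate(words):
            lo = max(0, i - window)
            hi = min(len(words), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vecs[w][index[words[j]]] += 1
    return dict(vecs), index

visibility_vecs, index = cooccurrence_vectors(captions)
```

In the paper's setting, such caption-derived vectors would then be concatenated with contextual representations (e.g. ELMo) as input to a BiLSTM classifier; here they simply show how visually grounded text yields word vectors biased toward concrete, perceivable contexts.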
Anthology ID:
2020.lrec-1.726
Volume:
Proceedings of the Twelfth Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
5928–5933
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.726
Cite (ACL):
Gitit Kehat and James Pustejovsky. 2020. Improving Neural Metaphor Detection with Visual Datasets. In Proceedings of the Twelfth Language Resources and Evaluation Conference, pages 5928–5933, Marseille, France. European Language Resources Association.
Cite (Informal):
Improving Neural Metaphor Detection with Visual Datasets (Kehat & Pustejovsky, LREC 2020)
PDF:
https://aclanthology.org/2020.lrec-1.726.pdf