Cooking Up a Neural-based Model for Recipe Classification

Elham Mohammadi, Nada Naji, Louis Marceau, Marc Queudot, Eric Charton, Leila Kosseim, Marie-Jean Meurs


Abstract
In this paper, we propose a neural-based model to address the first task of the DEFT 2013 shared task, whose main challenge is a highly imbalanced dataset, using state-of-the-art embedding approaches and deep architectures. We report on our experiments on the use of linguistic features, extracted by Charton et al. (2014), in different neural models utilizing pretrained embeddings. Our results show that all of the models that use linguistic features outperform their counterpart models that only use pretrained embeddings. The best-performing model uses pretrained CamemBERT embeddings as input, a CNN as the hidden layer, and the additional linguistic features. Adding the linguistic features to this model improves its performance by 4.5% and 11.4% in terms of micro and macro F1 scores, respectively, leading to state-of-the-art results and an improved classification of the rare classes.
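The best-performing architecture in the abstract can be sketched as follows. This is a minimal illustrative PyTorch sketch, not the authors' implementation: the embedding dimension matches CamemBERT's hidden size (768), but the filter count, kernel size, number of linguistic features, and number of classes are assumptions, and random tensors stand in for real CamemBERT embeddings and extracted linguistic features.

```python
import torch
import torch.nn as nn

class RecipeClassifier(nn.Module):
    """Sketch: a 1-D CNN over pretrained token embeddings, with a vector of
    linguistic features concatenated before the final classification layer."""

    def __init__(self, emb_dim=768, n_filters=100, kernel_size=3,
                 n_ling_feats=20, n_classes=4):  # sizes are illustrative
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size)
        self.fc = nn.Linear(n_filters + n_ling_feats, n_classes)

    def forward(self, emb, ling):
        # emb: (batch, seq_len, emb_dim) token embeddings, e.g. from CamemBERT
        # ling: (batch, n_ling_feats) hand-crafted linguistic features
        x = torch.relu(self.conv(emb.transpose(1, 2)))  # (batch, filters, L')
        x = x.max(dim=2).values                         # global max pooling
        return self.fc(torch.cat([x, ling], dim=1))     # (batch, n_classes)

model = RecipeClassifier()
emb = torch.randn(2, 50, 768)   # dummy embeddings for 2 recipes, 50 tokens
ling = torch.randn(2, 20)       # dummy linguistic feature vectors
logits = model(emb, ling)
print(logits.shape)  # torch.Size([2, 4])
```

Concatenating the linguistic features after pooling, rather than at the token level, keeps the document-level features intact regardless of sequence length; this is one plausible way to combine the two inputs the paper describes.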
Anthology ID:
2020.lrec-1.615
Volume:
Proceedings of the Twelfth Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Venue:
LREC
Publisher:
European Language Resources Association
Note:
Pages:
5000–5009
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.615
Cite (ACL):
Elham Mohammadi, Nada Naji, Louis Marceau, Marc Queudot, Eric Charton, Leila Kosseim, and Marie-Jean Meurs. 2020. Cooking Up a Neural-based Model for Recipe Classification. In Proceedings of the Twelfth Language Resources and Evaluation Conference, pages 5000–5009, Marseille, France. European Language Resources Association.
Cite (Informal):
Cooking Up a Neural-based Model for Recipe Classification (Mohammadi et al., LREC 2020)
PDF:
https://aclanthology.org/2020.lrec-1.615.pdf