BL.Research at SemEval-2022 Task 1: Deep networks for Reverse Dictionary using embeddings and LSTM autoencoders
Nihed Bendahman, Julien Breton, Lina Nicolaieff, Mokhtar Boumedyen Billami, Christophe Bortolaso, Youssef Miloudi
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
This paper describes our two deep learning systems that competed at SemEval-2022 Task 1 “CODWOE: Comparing Dictionaries and WOrd Embeddings”. We participated in the reverse dictionary subtask, which consists in generating embedding vectors from glosses. We use sequential models that combine several neural network layers, from embedding layers through dense layers to Bidirectional Long Short-Term Memory (BiLSTM) and LSTM layers. All glosses are preprocessed so as to obtain the best representation of the meaning of every word that appears. We achieved competitive results on the reverse dictionary subtask, ranking second for English and French when using contextualized embeddings, and second for English, French and Spanish when using character-based embeddings.
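The abstract only names the building blocks (embedding, dense, BiLSTM and LSTM layers), so the sketch below shows one plausible way to wire them into a gloss-to-vector model in Keras. It is a minimal illustration, not the authors' reported configuration: the vocabulary size, gloss length, layer widths and 256-dimensional target are assumed values.

# Minimal sketch (assumptions, not the authors' exact code) of a
# gloss-to-vector "reverse dictionary" model: token embeddings fed
# through BiLSTM and LSTM layers, then a dense projection onto the
# target word-embedding space.
import tensorflow as tf

VOCAB_SIZE = 20000      # assumed gloss vocabulary size
MAX_GLOSS_LEN = 64      # assumed maximum gloss length in tokens
TARGET_DIM = 256        # assumed dimensionality of the target embedding

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(MAX_GLOSS_LEN,), dtype="int32"),
    # Map gloss token ids to dense vectors (padding tokens are masked).
    tf.keras.layers.Embedding(VOCAB_SIZE, 128, mask_zero=True),
    # Read the gloss in both directions, keeping the full sequence.
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(128, return_sequences=True)),
    # Compress the sequence into a single summary vector.
    tf.keras.layers.LSTM(128),
    # Project the summary onto the target word-embedding space.
    tf.keras.layers.Dense(TARGET_DIM),
])

# The subtask is a regression from glosses to word vectors, so a
# distance loss such as MSE (or cosine distance) is a natural choice.
model.compile(optimizer="adam", loss="mse")
model.summary()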