TLDR at SemEval-2022 Task 1: Using Transformers to Learn Dictionaries and Representations

Aditya Srivastava, Harsha Vardhan Vemulapati


Abstract
We propose a pair of deep learning models that employ unsupervised pretraining, attention mechanisms, and contrastive learning to learn representations from dictionary definitions and to model definitions from such representations. Our systems, the Transformers for Learning Dictionaries and Representations (TLDR), were submitted to SemEval-2022 Task 1: Comparing Dictionaries and Word Embeddings (CODWOE), where they officially ranked first on the definition modeling subtask and achieved competitive performance on the reverse dictionary subtask. In this paper we describe our methodology and analyse our system design hypotheses.
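The abstract does not spell out the training objective, but a common instantiation of contrastive representation learning over (definition, word-embedding) pairs is an InfoNCE-style loss, where each encoded definition is pulled toward the embedding of the word it defines and pushed away from the other words in the batch. The PyTorch sketch below is a generic illustration under that assumption, not the authors' exact method; the function name, argument names, and the temperature value 0.07 are all hypothetical.

import torch
import torch.nn.functional as F

def info_nce_loss(def_embeddings, word_embeddings, temperature=0.07):
    """InfoNCE-style contrastive loss: each definition embedding should be
    most similar to the embedding of the word it defines, relative to the
    other words in the batch (which act as in-batch negatives)."""
    # Normalize so the dot product below is cosine similarity.
    d = F.normalize(def_embeddings, dim=-1)   # (batch, dim)
    w = F.normalize(word_embeddings, dim=-1)  # (batch, dim)
    logits = d @ w.t() / temperature          # (batch, batch) similarity matrix
    # Matching (definition, word) pairs lie on the diagonal.
    targets = torch.arange(d.size(0), device=d.device)
    return F.cross_entropy(logits, targets)

# Toy usage: in practice a transformer encoder would produce def_embeddings
# from the tokenized glosses; random tensors stand in for illustration.
batch, dim = 8, 256
loss = info_nce_loss(torch.randn(batch, dim), torch.randn(batch, dim))
print(loss.item())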
Anthology ID:
2022.semeval-1.6
Volume:
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Guy Emerson, Natalie Schluter, Gabriel Stanovsky, Ritesh Kumar, Alexis Palmer, Nathan Schneider, Siddharth Singh, Shyam Ratan
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
60–67
URL:
https://aclanthology.org/2022.semeval-1.6
DOI:
10.18653/v1/2022.semeval-1.6
Cite (ACL):
Aditya Srivastava and Harsha Vardhan Vemulapati. 2022. TLDR at SemEval-2022 Task 1: Using Transformers to Learn Dictionaries and Representations. In Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022), pages 60–67, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
TLDR at SemEval-2022 Task 1: Using Transformers to Learn Dictionaries and Representations (Srivastava & Vemulapati, SemEval 2022)
PDF:
https://aclanthology.org/2022.semeval-1.6.pdf
Video:
https://aclanthology.org/2022.semeval-1.6.mp4