TLDR at SemEval-2022 Task 1: Using Transformers to Learn Dictionaries and Representations
Aditya Srivastava | Harsha Vardhan Vemulapati
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
We propose a pair of deep learning models which employ unsupervised pretraining, attention mechanisms, and contrastive learning to learn representations from dictionary definitions, and to model definitions from such representations. Our systems, the Transformers for Learning Dictionaries and Representations (TLDR), were submitted to SemEval-2022 Task 1: Comparing Dictionaries and Word Embeddings (CODWOE), where they officially ranked first on the definition modeling subtask and achieved competitive performance on the reverse dictionary subtask. In this paper, we describe our methodology and analyse our system design hypotheses.
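The contrastive learning the abstract mentions can be illustrated with a minimal InfoNCE-style loss: given a batch of definition embeddings and their paired word embeddings, each definition should score higher against its own word than against the other words in the batch. This is only a generic sketch of the technique, not the authors' implementation; the function `info_nce`, the pure-Python vectors, and the temperature `tau` are all illustrative assumptions.

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(def_embs, word_embs, tau=0.1):
    """Illustrative InfoNCE-style contrastive loss (not the TLDR code).

    def_embs[i] is treated as the positive pair of word_embs[i];
    all other word embeddings in the batch act as negatives.
    """
    losses = []
    for i, d in enumerate(def_embs):
        # Temperature-scaled similarities against every candidate word.
        sims = [cosine(d, w) / tau for w in word_embs]
        # Numerically stable log-sum-exp for the softmax denominator.
        m = max(sims)
        log_z = m + math.log(sum(math.exp(s - m) for s in sims))
        # Cross-entropy of picking the correct (i-th) word.
        losses.append(log_z - sims[i])
    return sum(losses) / len(losses)
```

With aligned pairs the loss is near zero, while shuffling the word embeddings against the definitions drives it up, which is the signal the representation learner optimizes.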