Unsupervised Attention-based Sentence-Level Meta-Embeddings from Contextualised Language Models

Keigo Takahashi, Danushka Bollegala


Abstract
A variety of contextualised language models have been proposed in the NLP community, trained on diverse corpora to produce numerous Neural Language Models (NLMs). However, different NLMs have shown different levels of performance in downstream NLP applications when used as text representations. We propose a sentence-level meta-embedding learning method that takes independently trained contextualised word embedding models and learns a sentence embedding that preserves the complementary strengths of the input source NLMs. Our proposed method is unsupervised and is not tied to a particular downstream task, which makes the learnt meta-embeddings in principle applicable to different tasks that require sentence representations. Specifically, we first project the token-level embeddings obtained from the individual NLMs into a common space and learn attention weights that indicate the contribution of each source embedding towards the token-level meta-embeddings. Next, we apply mean and max pooling to produce sentence-level meta-embeddings from the token-level meta-embeddings. Experimental results on semantic textual similarity benchmarks show that our proposed unsupervised sentence-level meta-embedding method outperforms previously proposed sentence-level meta-embedding methods as well as a supervised baseline.
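The abstract describes a two-step pipeline: project each source NLM's token embeddings into a common space, combine them with learned per-token attention weights, then apply mean and max pooling to get a sentence embedding. The following is a minimal PyTorch sketch of that pipeline, not the authors' implementation; the module name, the meta-embedding dimensionality, the single-linear-layer scorer, and the choice to concatenate the two pooled vectors are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class AttentionMetaEmbedding(nn.Module):
    """Sketch of attention-based sentence-level meta-embedding.

    Each source NLM's token embeddings are projected into a common
    meta-embedding space; learned attention weights decide how much
    each source contributes per token, and mean + max pooling turns
    the token-level meta-embeddings into a sentence embedding.
    """
    def __init__(self, source_dims, meta_dim=512):
        super().__init__()
        # One linear projection per source NLM into the shared space.
        self.projections = nn.ModuleList(
            nn.Linear(d, meta_dim) for d in source_dims
        )
        # Scores each projected token embedding; a softmax over the
        # source axis yields per-token attention weights per source.
        self.scorer = nn.Linear(meta_dim, 1)

    def forward(self, source_embeddings):
        # source_embeddings: list of tensors, one per NLM,
        # each of shape (batch, seq_len, source_dim_i).
        projected = torch.stack(
            [proj(e) for proj, e in zip(self.projections, source_embeddings)],
            dim=2,
        )  # (batch, seq_len, num_sources, meta_dim)
        attn = torch.softmax(self.scorer(projected), dim=2)
        # Token-level meta-embedding: attention-weighted sum over sources.
        token_meta = (attn * projected).sum(dim=2)  # (batch, seq_len, meta_dim)
        # Sentence-level meta-embedding: concatenate mean and max pooling
        # over the token axis (one reasonable way to use both poolings).
        mean_pooled = token_meta.mean(dim=1)
        max_pooled = token_meta.max(dim=1).values
        return torch.cat([mean_pooled, max_pooled], dim=-1)

# Example with two hypothetical source NLMs of different dimensionality.
model = AttentionMetaEmbedding(source_dims=[768, 1024])
e1 = torch.randn(4, 16, 768)   # e.g. token embeddings from a base-size NLM
e2 = torch.randn(4, 16, 1024)  # e.g. token embeddings from a large NLM
sentence_meta = model([e1, e2])
print(sentence_meta.shape)     # torch.Size([4, 1024]) = 512 mean + 512 max
```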
Anthology ID:
2022.lrec-1.775
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
7155–7163
URL:
https://aclanthology.org/2022.lrec-1.775
Cite (ACL):
Keigo Takahashi and Danushka Bollegala. 2022. Unsupervised Attention-based Sentence-Level Meta-Embeddings from Contextualised Language Models. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 7155–7163, Marseille, France. European Language Resources Association.
Cite (Informal):
Unsupervised Attention-based Sentence-Level Meta-Embeddings from Contextualised Language Models (Takahashi & Bollegala, LREC 2022)
PDF:
https://aclanthology.org/2022.lrec-1.775.pdf