Retrofitting Multilingual Sentence Embeddings with Abstract Meaning Representation

Deng Cai, Xin Li, Jackie Chun-Sing Ho, Lidong Bing, Wai Lam


Abstract
We introduce a new method to improve existing multilingual sentence embeddings with Abstract Meaning Representation (AMR). Compared with the original textual input, AMR is a structured semantic representation that presents the core concepts and relations in a sentence explicitly and unambiguously. It also helps reduce surface variations across different expressions and languages. Unlike most prior work that only evaluates the ability to measure semantic similarity, we present a thorough evaluation of existing multilingual sentence embeddings and our improved versions, including a collection of five transfer tasks in different downstream applications. Experimental results show that retrofitting multilingual sentence embeddings with AMR leads to better state-of-the-art performance on both semantic textual similarity and transfer tasks.
Anthology ID:
2022.emnlp-main.433
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6456–6472
URL:
https://aclanthology.org/2022.emnlp-main.433
DOI:
10.18653/v1/2022.emnlp-main.433
Cite (ACL):
Deng Cai, Xin Li, Jackie Chun-Sing Ho, Lidong Bing, and Wai Lam. 2022. Retrofitting Multilingual Sentence Embeddings with Abstract Meaning Representation. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 6456–6472, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Retrofitting Multilingual Sentence Embeddings with Abstract Meaning Representation (Cai et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.433.pdf