Multilingual Sentence Transformer as A Multilingual Word Aligner

Weikang Wang, Guanhua Chen, Hanqing Wang, Yue Han, Yun Chen


Abstract
Multilingual pretrained language models (mPLMs) have shown their effectiveness in multilingual word alignment induction. However, these methods usually start from mBERT or XLM-R. In this paper, we investigate whether the multilingual sentence Transformer LaBSE is a strong multilingual word aligner. This idea is non-trivial, as LaBSE is trained to learn language-agnostic sentence-level embeddings, while the alignment extraction task requires the more fine-grained word-level embeddings to be language-agnostic. We demonstrate that vanilla LaBSE outperforms other mPLMs currently used for the alignment task, and then propose to finetune LaBSE on parallel corpora for further improvement. Experimental results on seven language pairs show that our best aligner outperforms all varieties of previous state-of-the-art models. In addition, our aligner supports different language pairs in a single model, and even achieves new state-of-the-art results on zero-shot language pairs that do not appear in the finetuning data.
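To make the alignment-extraction setting concrete, below is a minimal sketch of similarity-based word alignment from contextual embeddings, using the public LaBSE checkpoint on Hugging Face. The bidirectional argmax-intersection heuristic, the use of the last hidden layer, and the subword-level granularity are illustrative assumptions for this sketch, not the authors' exact procedure (the paper may use a different layer and extraction rule).

```python
# Sketch: extract subword alignments from LaBSE contextual embeddings
# via a cosine-similarity matrix and bidirectional argmax intersection.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/LaBSE")
model = AutoModel.from_pretrained("sentence-transformers/LaBSE")
model.eval()

def embed(sentence):
    """Return subword embeddings (special tokens stripped) and tokens."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, dim)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    return hidden[1:-1], tokens[1:-1]  # drop [CLS] and [SEP]

def align(src, tgt):
    """Keep (i, j) iff j is i's best target match AND i is j's best source match."""
    src_emb, src_tok = embed(src)
    tgt_emb, tgt_tok = embed(tgt)
    sim = (torch.nn.functional.normalize(src_emb, dim=-1)
           @ torch.nn.functional.normalize(tgt_emb, dim=-1).T)
    fwd = sim.argmax(dim=1)  # best target index per source subword
    bwd = sim.argmax(dim=0)  # best source index per target subword
    return [(src_tok[i], tgt_tok[j.item()])
            for i, j in enumerate(fwd) if bwd[j].item() == i]

print(align("Das ist ein Test .", "This is a test ."))
```

The intersection of the two argmax directions is a standard high-precision extraction heuristic; the paper's finetuning on parallel corpora then sharpens the word-level similarity structure on which such extraction relies.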
Anthology ID:
2022.findings-emnlp.215
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2952–2963
URL:
https://aclanthology.org/2022.findings-emnlp.215
DOI:
10.18653/v1/2022.findings-emnlp.215
Cite (ACL):
Weikang Wang, Guanhua Chen, Hanqing Wang, Yue Han, and Yun Chen. 2022. Multilingual Sentence Transformer as A Multilingual Word Aligner. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 2952–2963, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Multilingual Sentence Transformer as A Multilingual Word Aligner (Wang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.215.pdf
Video:
https://aclanthology.org/2022.findings-emnlp.215.mp4