Robust Fragment-Based Framework for Cross-lingual Sentence Retrieval
Nattapol Trijakwanich | Peerat Limkonchotiwat | Raheem Sarwar | Wannaphong Phatthiyaphaibun | Ekapol Chuangsuwanich | Sarana Nutanong
Findings of the Association for Computational Linguistics: EMNLP 2021
Cross-lingual Sentence Retrieval (CLSR) aims at retrieving parallel sentence pairs that are translations of each other from a multilingual set of comparable documents. The retrieved parallel sentence pairs can be used in downstream NLP tasks such as machine translation and cross-lingual word sense disambiguation. We propose a CLSR framework called Robust Fragment-level Representation (RFR) to address Out-of-Domain (OOD) CLSR problems. In particular, we improve sentence retrieval robustness by representing each sentence as a collection of fragments, thereby changing the retrieval granularity from the sentence level to the fragment level. We performed CLSR experiments on three OOD datasets, four language pairs, and three well-known base sentence encoders: m-USE, LASER, and LaBSE. Experimental results show that RFR significantly improves the base encoders’ performance in more than 85% of the cases.
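To make the fragment-level idea concrete, here is a minimal sketch of fragment-granularity retrieval. It is not the paper's exact RFR method: the fragmentation scheme (contiguous word n-grams), the `encode` callable (standing in for any multilingual sentence encoder such as m-USE, LASER, or LaBSE), and the majority-vote aggregation from fragment matches back to sentences are all illustrative assumptions.

```python
# Sketch of fragment-level cross-lingual retrieval (assumptions noted above).
import numpy as np

def fragments(sentence: str, n: int = 4) -> list[str]:
    """Split a sentence into contiguous word n-gram fragments (assumed scheme)."""
    words = sentence.split()
    if len(words) <= n:
        return [sentence]
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

def retrieve(src_sentences, tgt_sentences, encode, n: int = 4):
    """For each source sentence, return the index of the target sentence whose
    fragments are most often the nearest neighbours of the source fragments."""
    # Encode all target fragments, remembering which sentence each came from.
    tgt_frag_owner, tgt_frags = [], []
    for j, t in enumerate(tgt_sentences):
        for f in fragments(t, n):
            tgt_frag_owner.append(j)
            tgt_frags.append(f)
    T = np.asarray(encode(tgt_frags), dtype=np.float32)
    T /= np.linalg.norm(T, axis=1, keepdims=True)

    matches = []
    for s in src_sentences:
        S = np.asarray(encode(fragments(s, n)), dtype=np.float32)
        S /= np.linalg.norm(S, axis=1, keepdims=True)
        # Nearest target fragment for each source fragment (cosine similarity).
        nearest = (S @ T.T).argmax(axis=1)
        # Vote at the sentence level: count how often each target sentence owns
        # the winning fragment, then pick the sentence with the most votes.
        votes = np.bincount([tgt_frag_owner[k] for k in nearest],
                            minlength=len(tgt_sentences))
        matches.append(int(votes.argmax()))
    return matches
```

The point of the sketch is the change in granularity: similarity is computed between short fragments rather than whole sentences, so a noisy or out-of-domain sentence can still be matched as long as enough of its fragments align with fragments of its translation.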