Enhancing Cross-lingual Natural Language Inference by Prompt-learning from Cross-lingual Templates

Kunxun Qi, Hai Wan, Jianfeng Du, Haolan Chen


Abstract
Cross-lingual natural language inference (XNLI) is a fundamental task in cross-lingual natural language understanding. Recently, this task has commonly been addressed with pre-trained cross-lingual language models. Existing methods usually enhance pre-trained language models with additional data, such as annotated parallel corpora. Such additional data, however, are rare in practice, especially for low-resource languages. Inspired by recent promising results achieved by prompt-learning, this paper proposes a novel prompt-learning based framework for enhancing XNLI. It reformulates the XNLI problem as a masked language modeling problem by constructing cloze-style questions through cross-lingual templates. To enforce correspondence between different languages, for every question the framework augments a new question using a template sampled from another language, and then introduces a consistency loss that makes the answer probability distribution obtained from the new question as similar as possible to the corresponding distribution obtained from the original question. Experimental results on two benchmark datasets demonstrate that XNLI models enhanced by the proposed framework significantly outperform the original ones under both the full-shot and few-shot cross-lingual transfer settings.
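The two ideas in the abstract — turning an NLI pair into a cloze-style question via a template, and penalizing disagreement between the answer distributions of the original and cross-lingually augmented questions — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the template string, the function names `cloze` and `consistency_loss`, and the choice of a symmetric KL divergence as the consistency term are all assumptions made for exposition.

```python
import numpy as np

# Hypothetical cloze template; the paper's actual cross-lingual
# templates are sampled per language and may differ in form.
def cloze(premise, hypothesis, template="{p} ? [MASK], {h}"):
    """Reformulate an NLI pair as a masked-LM question.
    The model fills [MASK] with a label verbalizer
    (e.g. "Yes" -> entailment, "Maybe" -> neutral, "No" -> contradiction)."""
    return template.format(p=premise, h=hypothesis)

def kl(p, q, eps=1e-12):
    """KL divergence KL(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    return float(np.sum(p * np.log(p / q)))

def consistency_loss(p_orig, p_aug):
    """Symmetric KL between the answer distribution from the original
    question and the one from its augmented counterpart built with a
    template in another language (one plausible instantiation of the
    consistency loss described in the abstract)."""
    return 0.5 * (kl(p_orig, p_aug) + kl(p_aug, p_orig))

# Example: distributions over {entailment, neutral, contradiction}
p_en = [0.7, 0.2, 0.1]  # answer distribution from an English template
p_fr = [0.6, 0.3, 0.1]  # answer distribution from a French template
loss = consistency_loss(p_en, p_fr)  # small but nonzero
```

The loss is zero only when the two distributions agree exactly, so minimizing it pushes the model toward language-invariant predictions regardless of which language's template produced the question.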
Anthology ID:
2022.acl-long.134
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1910–1923
URL:
https://aclanthology.org/2022.acl-long.134
DOI:
10.18653/v1/2022.acl-long.134
Cite (ACL):
Kunxun Qi, Hai Wan, Jianfeng Du, and Haolan Chen. 2022. Enhancing Cross-lingual Natural Language Inference by Prompt-learning from Cross-lingual Templates. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1910–1923, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Enhancing Cross-lingual Natural Language Inference by Prompt-learning from Cross-lingual Templates (Qi et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.134.pdf
Software:
 2022.acl-long.134.software.zip
Code:
 qikunxun/pct
Data:
PAWS-X