Supervised Contrastive Learning for Cross-lingual Transfer Learning

Wang Shuaibo, Di Hui, Huang Hui, Lai Siyu, Ouchi Kazushige, Chen Yufeng, Xu Jinan


Abstract
Multilingual pre-trained representations are not well aligned by nature, which harms their performance on cross-lingual tasks. Previous methods propose to post-align the multilingual pre-trained representations by multi-view alignment or contrastive learning. However, we argue that neither method is well suited to the cross-lingual classification objective, and in this paper we propose a simple yet effective method to better align the pre-trained representations. On the basis of cross-lingual data augmentations, we make a minor modification to the canonical contrastive loss to remove false-negative examples which should not be contrasted. Augmentations with the same class are brought close to the anchor sample, and augmentations with different classes are pushed apart. Experimental results on three cross-lingual tasks from the XTREME benchmark show that our method improves transfer performance by a large margin with no additional resources needed. We also provide a detailed analysis and comparison of different post-alignment strategies.
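The false-negative removal described in the abstract can be sketched as a supervised contrastive loss in which same-class examples serve only as positives and are excluded from the contrast (negative) set. The snippet below is an illustrative sketch of that general idea, not the paper's exact formulation; the function name and the temperature value are assumptions.

```python
import numpy as np

def supcon_loss_no_false_negatives(z, labels, tau=0.1):
    """Illustrative sketch: contrastive loss where same-class samples are
    treated as positives and removed from the negative set (one way to
    realize the false-negative removal described in the abstract)."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize embeddings
    sim = z @ z.T / tau                               # temperature-scaled cosine similarities
    n = len(labels)
    total, n_terms = 0.0, 0
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        negatives = [j for j in range(n) if labels[j] != labels[i]]
        if not positives:
            continue
        neg_sum = np.sum(np.exp(sim[i, negatives]))
        for p in positives:
            # each positive is contrasted only against true negatives;
            # other same-class samples never appear in the denominator
            total += -np.log(np.exp(sim[i, p]) / (np.exp(sim[i, p]) + neg_sum))
            n_terms += 1
    return total / n_terms
```

As a sanity check, embeddings that cluster by class should yield a lower loss than embeddings where the two classes are intermixed.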
Anthology ID:
2022.ccl-1.78
Volume:
Proceedings of the 21st Chinese National Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Nanchang, China
Editors:
Maosong Sun (孙茂松), Yang Liu (刘洋), Wanxiang Che (车万翔), Yang Feng (冯洋), Xipeng Qiu (邱锡鹏), Gaoqi Rao (饶高琦), Yubo Chen (陈玉博)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
884–895
Language:
English
URL:
https://aclanthology.org/2022.ccl-1.78
Cite (ACL):
Wang Shuaibo, Di Hui, Huang Hui, Lai Siyu, Ouchi Kazushige, Chen Yufeng, and Xu Jinan. 2022. Supervised Contrastive Learning for Cross-lingual Transfer Learning. In Proceedings of the 21st Chinese National Conference on Computational Linguistics, pages 884–895, Nanchang, China. Chinese Information Processing Society of China.
Cite (Informal):
Supervised Contrastive Learning for Cross-lingual Transfer Learning (Wang et al., CCL 2022)
PDF:
https://aclanthology.org/2022.ccl-1.78.pdf
Data
PAWS-X, XNLI