Unleashing the Multilingual Encoder Potential: Boosting Zero-Shot Performance via Probability Calibration

Ercong Nie, Helmut Schmid, Hinrich Schuetze


Abstract
Pretrained multilingual encoder models can directly perform zero-shot multilingual tasks or linguistic probing by reformulating the input examples into cloze-style prompts. This is accomplished by predicting the probabilities of the label words at the masked token position, without requiring any updates to the model parameters. However, the performance of this method is limited by the model's bias toward predicting label words that occurred frequently during pretraining; such words typically receive inflated probabilities. To address this issue, we combine the models with calibration techniques that adjust the label-word probabilities predicted by the models. We first validate the effectiveness of a simple calibration method we propose, together with existing techniques, on monolingual encoders in both zero- and few-shot scenarios. We then apply these calibration techniques to multilingual encoders, obtaining substantial performance improvements across a wide range of tasks.
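To make the described setup concrete, the following is a minimal sketch of cloze-style zero-shot classification with a multilingual masked LM, followed by a simple probability calibration step in the spirit of contextual calibration (dividing out the model's prior over label words estimated from a content-free input). The model name, prompt template, label words, and content-free input are illustrative assumptions, not the paper's exact method or setup.

```python
# Sketch: zero-shot cloze prompting with a multilingual encoder + calibration.
# Assumptions: xlm-roberta-base as the encoder, an English sentiment template,
# and label words that map to single vocabulary tokens.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL = "xlm-roberta-base"  # assumed multilingual encoder
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForMaskedLM.from_pretrained(MODEL)
model.eval()

# Hypothetical label words, assumed to tokenize to a single subword each.
label_words = ["good", "bad"]
label_ids = tokenizer.convert_tokens_to_ids(
    [tokenizer.tokenize(" " + w)[0] for w in label_words]
)

def label_probs(text: str) -> torch.Tensor:
    """Probabilities of the label words at the masked position, renormalized."""
    prompt = f"{text} It was {tokenizer.mask_token}."  # cloze-style template
    inputs = tokenizer(prompt, return_tensors="pt")
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    probs = torch.softmax(logits, dim=-1)[label_ids]
    return probs / probs.sum()

# Calibration: estimate the model's prior bias toward each label word from a
# content-free input and divide it out before choosing the label.
prior = label_probs("N/A")

def predict(text: str) -> str:
    calibrated = label_probs(text) / prior
    return label_words[int(torch.argmax(calibrated))]

print(predict("The movie was absolutely wonderful."))  # expected: "good"
```

No parameters are updated anywhere in this sketch; calibration only rescales the predicted label-word probabilities, which is what allows it to counteract the frequency bias discussed in the abstract.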
Anthology ID:
2023.findings-emnlp.1056
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15774–15782
URL:
https://aclanthology.org/2023.findings-emnlp.1056
DOI:
10.18653/v1/2023.findings-emnlp.1056
Cite (ACL):
Ercong Nie, Helmut Schmid, and Hinrich Schuetze. 2023. Unleashing the Multilingual Encoder Potential: Boosting Zero-Shot Performance via Probability Calibration. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 15774–15782, Singapore. Association for Computational Linguistics.
Cite (Informal):
Unleashing the Multilingual Encoder Potential: Boosting Zero-Shot Performance via Probability Calibration (Nie et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.1056.pdf