%0 Conference Proceedings
%T Improving Cross-lingual Text Classification with Zero-shot Instance-Weighting
%A Li, Irene
%A Sen, Prithviraj
%A Zhu, Huaiyu
%A Li, Yunyao
%A Radev, Dragomir
%Y Rogers, Anna
%Y Calixto, Iacer
%Y Vulić, Ivan
%Y Saphra, Naomi
%Y Kassner, Nora
%Y Camburu, Oana-Maria
%Y Bansal, Trapit
%Y Shwartz, Vered
%S Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021)
%D 2021
%8 August
%I Association for Computational Linguistics
%C Online
%F li-etal-2021-improving-cross
%X Cross-lingual text classification (CLTC) is a challenging task, made harder still by the lack of labeled data in low-resource languages. In this paper, we propose zero-shot instance-weighting, a general, model-agnostic zero-shot learning framework for improving CLTC by leveraging source instance weighting. It adds a module on top of pre-trained language models that computes instance weights via similarity, thus aligning each source instance to the target language. During training, the framework updates parameters using gradient descent weighted by these instance weights. We evaluate this framework on three fundamental tasks over seven target languages and show its effectiveness and extensibility, improving the F1 score by up to 4% in single-source transfer and 8% in multi-source transfer. To the best of our knowledge, our method is the first to apply instance weighting to zero-shot CLTC. It is simple yet effective and easily extensible to multi-source transfer.
%R 10.18653/v1/2021.repl4nlp-1.1
%U https://aclanthology.org/2021.repl4nlp-1.1
%U https://doi.org/10.18653/v1/2021.repl4nlp-1.1
%P 1-7