%0 Conference Proceedings
%T A Closer Look at Few-Shot Crosslingual Transfer: The Choice of Shots Matters
%A Zhao, Mengjie
%A Zhu, Yi
%A Shareghi, Ehsan
%A Vulić, Ivan
%A Reichart, Roi
%A Korhonen, Anna
%A Schütze, Hinrich
%Y Zong, Chengqing
%Y Xia, Fei
%Y Li, Wenjie
%Y Navigli, Roberto
%S Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
%D 2021
%8 August
%I Association for Computational Linguistics
%C Online
%F zhao-etal-2021-closer
%X Few-shot crosslingual transfer has been shown to outperform its zero-shot counterpart with pretrained encoders like multilingual BERT. Despite its growing popularity, little to no attention has been paid to standardizing and analyzing the design of few-shot experiments. In this work, we highlight a fundamental risk posed by this shortcoming, illustrating that the model exhibits a high degree of sensitivity to the selection of few shots. We conduct a large-scale experimental study on 40 sets of sampled few shots for six diverse NLP tasks across up to 40 languages. We provide an analysis of success and failure cases of few-shot transfer, which highlights the role of lexical features. Additionally, we show that a straightforward full model finetuning approach is quite effective for few-shot transfer, outperforming several state-of-the-art few-shot approaches. As a step towards standardizing few-shot crosslingual experimental designs, we make our sampled few shots publicly available.
%R 10.18653/v1/2021.acl-long.447
%U https://aclanthology.org/2021.acl-long.447
%U https://doi.org/10.18653/v1/2021.acl-long.447
%P 5751-5767