%0 Conference Proceedings
%T Unsupervised Cross-Lingual Transfer of Structured Predictors without Source Data
%A Kurniawan, Kemal
%A Frermann, Lea
%A Schulz, Philip
%A Cohn, Trevor
%Y Carpuat, Marine
%Y de Marneffe, Marie-Catherine
%Y Meza Ruiz, Ivan Vladimir
%S Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
%D 2022
%8 July
%I Association for Computational Linguistics
%C Seattle, United States
%F kurniawan-etal-2022-unsupervised
%X Providing technologies to communities or domains where training data is scarce or protected, e.g., for privacy reasons, is becoming increasingly important. To that end, we generalise methods for unsupervised transfer from multiple input models for structured prediction. We show that the means of aggregating over the input models is critical, and that multiplying marginal probabilities of substructures to obtain high-probability structures for distant supervision is substantially better than taking the union of such structures over the input models, as done in prior work. Testing on 18 languages, we demonstrate that the method works in a cross-lingual setting, considering both dependency parsing and part-of-speech structured prediction problems. Our analyses show that the proposed method produces less noisy labels for the distant supervision.
%R 10.18653/v1/2022.naacl-main.149
%U https://aclanthology.org/2022.naacl-main.149
%U https://doi.org/10.18653/v1/2022.naacl-main.149
%P 2041-2054