%0 Conference Proceedings
%T Language Model Pretraining and Transfer Learning for Very Low Resource Languages
%A Khatri, Jyotsana
%A Murthy, Rudra
%A Bhattacharyya, Pushpak
%Y Barrault, Loic
%Y Bojar, Ondrej
%Y Bougares, Fethi
%Y Chatterjee, Rajen
%Y Costa-jussa, Marta R.
%Y Federmann, Christian
%Y Fishel, Mark
%Y Fraser, Alexander
%Y Freitag, Markus
%Y Graham, Yvette
%Y Grundkiewicz, Roman
%Y Guzman, Paco
%Y Haddow, Barry
%Y Huck, Matthias
%Y Yepes, Antonio Jimeno
%Y Koehn, Philipp
%Y Kocmi, Tom
%Y Martins, Andre
%Y Morishita, Makoto
%Y Monz, Christof
%S Proceedings of the Sixth Conference on Machine Translation
%D 2021
%8 November
%I Association for Computational Linguistics
%C Online
%F khatri-etal-2021-language-model
%X This paper describes our submission for the shared task on Unsupervised MT and Very Low Resource Supervised MT at WMT 2021. We submitted systems for two language pairs: German ↔ Upper Sorbian (de ↔ hsb) and German ↔ Lower Sorbian (de ↔ dsb). For de ↔ hsb, we pretrain our system using the MASS (Masked Sequence to Sequence) objective and then finetune it using iterative back-translation. Final finetuning is performed on the provided parallel data using the translation objective. For de ↔ dsb, no parallel data is provided in the task, so we use the final de ↔ hsb model as initialization for the de ↔ dsb model and train it further with iterative back-translation, using the same vocabulary as the de ↔ hsb model.
%U https://aclanthology.org/2021.wmt-1.106
%P 995-998