%0 Conference Proceedings
%T Multiple Pivot Languages and Strategic Decoder Initialization Helps Neural Machine Translation
%A Mhaskar, Shivam
%A Bhattacharyya, Pushpak
%Y Ojha, Atul Kr.
%Y Liu, Chao-Hong
%Y Vylomova, Ekaterina
%Y Abbott, Jade
%Y Washington, Jonathan
%Y Oco, Nathaniel
%Y Pirinen, Tommi A.
%Y Malykh, Valentin
%Y Logacheva, Varvara
%Y Zhao, Xiaobing
%S Proceedings of the Fifth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2022)
%D 2022
%8 October
%I Association for Computational Linguistics
%C Gyeongju, Republic of Korea
%F mhaskar-bhattacharyya-2022-multiple
%X In machine translation, a pivot language can be used to assist the source to target translation model. In pivot-based transfer learning, the source to pivot and the pivot to target models are used to improve the performance of the source to target model. This technique works best when both source-pivot and pivot-target are high resource language pairs and the source-target is a low resource language pair. But in some cases, such as Indic languages, the pivot to target language pair is not a high resource one. To overcome this limitation, we use multiple related languages as pivot languages to assist the source to target model. We show that using multiple pivot languages gives 2.03 BLEU and 3.05 chrF score improvement over the baseline model. We show that strategic decoder initialization while performing pivot-based transfer learning with multiple pivot languages gives a 3.67 BLEU and 5.94 chrF score improvement over the baseline model.
%U https://aclanthology.org/2022.loresmt-1.2
%P 9-14