%0 Conference Proceedings %T Bridging Philippine Languages With Multilingual Neural Machine Translation %A Baliber, Renz Iver %A Cheng, Charibeth %A Adlaon, Kristine Mae %A Mamonong, Virgion %Y Karakanta, Alina %Y Ojha, Atul Kr. %Y Liu, Chao-Hong %Y Abbott, Jade %Y Ortega, John %Y Washington, Jonathan %Y Oco, Nathaniel %Y Lakew, Surafel Melaku %Y Pirinen, Tommi A. %Y Malykh, Valentin %Y Logacheva, Varvara %Y Zhao, Xiaobing %S Proceedings of the 3rd Workshop on Technologies for MT of Low Resource Languages %D 2020 %8 December %I Association for Computational Linguistics %C Suzhou, China %F baliber-etal-2020-bridging %X The Philippines is home to more than 150 languages, yet even its major languages are considered low-resource. This results in a lack of effort to develop translation systems for the underrepresented languages. To simplify the process of developing translation systems for multiple languages, and to help improve the translation quality of zero- to low-resource languages, multilingual NMT has become an active area of research. However, existing work in multilingual NMT disregards the analysis of a multilingual model on a closely related, low-resource language group in the context of pivot-based translation and zero-shot translation. In this paper, we benchmark translation for several Philippine languages and provide an analysis of a multilingual NMT system for morphologically rich, low-resource languages in terms of its effectiveness in translating zero-resource languages via zero-shot translation.
To further evaluate the capability of the multilingual NMT model in translating language pairs unseen during training, we tested the model on translation between Tagalog and Cebuano and compared its performance with that of a simple NMT model trained directly on parallel Tagalog-Cebuano data. We show that zero-shot translation outperforms the directly trained model in some instances, while using English as a pivot language outperforms both approaches. %U https://aclanthology.org/2020.loresmt-1.2 %P 14-22