Machine Translation for a Very Low-Resource Language - Layer Freezing Approach on Transfer Learning

Amartya Chowdhury, Deepak K. T., Samudra Vijaya K, S. R. Mahadeva Prasanna


Abstract
This paper presents the implementation of Machine Translation (MT) between Lambani, a low-resource Indian tribal language, and English, a high-resource universal language. Lambani is spoken by nomadic tribes of the Indian state of Karnataka and shares similarities with various other Indian languages. To implement the English-Lambani MT system, we followed the transfer learning approach with English-Kannada as the parent MT model. The implementation and performance of the English-Lambani MT system are discussed in this paper. Since Lambani has been influenced by various other languages, we explored whether better MT performance could be obtained by using parent models associated with related Indian languages. Specifically, we experimented with English-Gujarati and English-Marathi as additional parent models. We compare the performance of three different English-Lambani MT systems derived from these three parent language models, and the observations are presented in the paper. Additionally, we explore the effect of freezing the encoder layers versus the decoder layers during fine-tuning, and report how each choice affects performance.
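The layer-freezing idea in the abstract can be illustrated with a minimal sketch. This is not the authors' code: it assumes a hypothetical parent model (e.g. English-Kannada) whose parameters are reused for the child English-Lambani task, with encoder parameters held fixed while only decoder parameters receive gradient updates. Parameter names, values, and the plain SGD update are all illustrative.

```python
# Sketch of transfer learning with layer freezing (illustrative only).
# Parameters from a parent MT model are reused; those whose names match a
# frozen prefix keep their parent weights, while the rest are fine-tuned.

def sgd_step(params, grads, frozen_prefixes, lr=0.1):
    """One SGD update that skips parameters under any frozen prefix."""
    updated = {}
    for name, value in params.items():
        if any(name.startswith(p) for p in frozen_prefixes):
            updated[name] = value                      # frozen: keep parent weight
        else:
            updated[name] = value - lr * grads[name]   # fine-tune on child data
    return updated

# Hypothetical parent-model weights and child-task gradients.
params = {"encoder.layer0.w": 1.0, "decoder.layer0.w": 1.0}
grads  = {"encoder.layer0.w": 0.5, "decoder.layer0.w": 0.5}

# Freeze the encoder; only the decoder adapts to the low-resource language.
new_params = sgd_step(params, grads, frozen_prefixes=("encoder.",))
```

In a real framework one would instead set `requires_grad = False` on the frozen module's parameters, but the effect is the same: the frozen side retains the parent model's representations while the other side adapts to the child language pair.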
Anthology ID:
2022.loresmt-1.7
Volume:
Proceedings of the Fifth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2022)
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Atul Kr. Ojha, Chao-Hong Liu, Ekaterina Vylomova, Jade Abbott, Jonathan Washington, Nathaniel Oco, Tommi A Pirinen, Valentin Malykh, Varvara Logacheva, Xiaobing Zhao
Venue:
LoResMT
Publisher:
Association for Computational Linguistics
Pages:
48–55
URL:
https://aclanthology.org/2022.loresmt-1.7
Cite (ACL):
Amartya Chowdhury, Deepak K. T., Samudra Vijaya K, and S. R. Mahadeva Prasanna. 2022. Machine Translation for a Very Low-Resource Language - Layer Freezing Approach on Transfer Learning. In Proceedings of the Fifth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2022), pages 48–55, Gyeongju, Republic of Korea. Association for Computational Linguistics.
Cite (Informal):
Machine Translation for a Very Low-Resource Language - Layer Freezing Approach on Transfer Learning (Chowdhury et al., LoResMT 2022)
PDF:
https://aclanthology.org/2022.loresmt-1.7.pdf