Brett Fazio
2021
Morphologically-Guided Segmentation For Translation of Agglutinative Low-Resource Languages
William Chen | Brett Fazio
Proceedings of the 4th Workshop on Technologies for MT of Low Resource Languages (LoResMT2021)
Neural Machine Translation (NMT) for Low Resource Languages (LRL) is often limited by the lack of available training data, making it necessary to explore additional techniques to improve translation quality. We propose the use of the Prefix-Root-Postfix-Encoding (PRPE) subword segmentation algorithm to improve translation quality for LRLs, using two agglutinative languages as case studies: Quechua and Indonesian. In the course of our experiments, we reintroduce a parallel corpus for Quechua-Spanish translation that was previously unavailable for NMT. Our experiments show the importance of appropriate subword segmentation, which can improve translation quality even over systems trained on much larger quantities of data. We demonstrate this by achieving state-of-the-art results for both languages, obtaining higher BLEU scores than large pre-trained models while using far less data.
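To give a rough sense of what prefix-root-postfix style segmentation means in practice, the Python sketch below strips known affixes from a word and marks subword boundaries. It is only an illustration under assumed, hypothetical affix lists and the BPE-style "@@" joining convention; it is not the actual PRPE algorithm described in the paper.

# Toy sketch of morphologically guided subword segmentation
# (hypothetical affix lists; not the PRPE implementation).
PREFIXES = ("meng", "mem", "ber")      # assumed Indonesian-style prefixes
SUFFIXES = ("kan", "nya", "an", "i")   # assumed suffixes, checked longest-first

def segment(word, min_root=3):
    # Strip at most one known prefix, keeping a minimum root length.
    prefix = next((p for p in PREFIXES
                   if word.startswith(p) and len(word) - len(p) >= min_root), "")
    root = word[len(prefix):]
    # Repeatedly strip known suffixes from the end of the remaining root.
    suffixes = []
    while True:
        match = next((s for s in sorted(SUFFIXES, key=len, reverse=True)
                      if root.endswith(s) and len(root) - len(s) >= min_root), None)
        if match is None:
            break
        suffixes.insert(0, match)
        root = root[: -len(match)]
    # Mark non-final pieces with "@@" so the translation system can rejoin them.
    pieces = ([prefix] if prefix else []) + [root] + suffixes
    return [p + "@@" for p in pieces[:-1]] + [pieces[-1]]

print(segment("membersihkannya"))  # -> ['mem@@', 'bersih@@', 'kan@@', 'nya']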
The UCF Systems for the LoResMT 2021 Machine Translation Shared Task
William Chen | Brett Fazio
Proceedings of the 4th Workshop on Technologies for MT of Low Resource Languages (LoResMT2021)
We present the University of Central Florida systems for the LoResMT 2021 Shared Task, participating in the English-Irish and English-Marathi translation pairs. We focused our efforts on the constrained track of the task, using transfer learning and subword segmentation to enhance our models given small amounts of training data. Our models achieved the highest BLEU scores on the fully constrained tracks of English-Irish, Irish-English, and Marathi-English, with scores of 13.5, 21.3, and 17.9, respectively.