Don’t Go Far Off: An Empirical Study on Neural Poetry Translation

Tuhin Chakrabarty, Arkadiy Saakyan, Smaranda Muresan

Abstract
Despite constant improvements in machine translation quality, automatic poetry translation remains a challenging problem due to the lack of open-sourced parallel poetic corpora, and to the intrinsic complexities involved in preserving the semantics, style and figurative nature of poetry. We present an empirical investigation for poetry translation along several dimensions: 1) size and style of training data (poetic vs. non-poetic), including a zero-shot setup; 2) bilingual vs. multilingual learning; and 3) language-family-specific models vs. mixed-language-family models. To accomplish this, we contribute a parallel dataset of poetry translations for several language pairs. Our results show that multilingual fine-tuning on poetic text significantly outperforms multilingual fine-tuning on non-poetic text that is 35X larger in size, both in terms of automatic metrics (BLEU, BERTScore, COMET) and human evaluation metrics such as faithfulness (meaning and poetic style). Moreover, multilingual fine-tuning on poetic data outperforms bilingual fine-tuning on poetic data.
Anthology ID:
2021.emnlp-main.577
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7253–7265
URL:
https://aclanthology.org/2021.emnlp-main.577
DOI:
10.18653/v1/2021.emnlp-main.577
Cite (ACL):
Tuhin Chakrabarty, Arkadiy Saakyan, and Smaranda Muresan. 2021. Don’t Go Far Off: An Empirical Study on Neural Poetry Translation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 7253–7265, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Don’t Go Far Off: An Empirical Study on Neural Poetry Translation (Chakrabarty et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.577.pdf
Video:
https://aclanthology.org/2021.emnlp-main.577.mp4