Working with Pre-translated Texts: Preliminary Findings from a Survey on Post-editing and Revision Practices in Swiss Corporate In-house Language Services

Sabrina Girletti


Abstract
With the arrival of neural machine translation, the boundaries between revision and post-editing (PE) have started to blur (Koponen et al., 2020). To shed light on current professional practices and provide new pedagogical perspectives, we set up a survey-based study to investigate how PE and revision are carried out in professional settings. We received 86 responses from translators working at 26 different corporate in-house language services in Switzerland. Although the differences between the two activities seem to be clear to in-house linguists, our findings show that they tend to use the same reading strategies when working with human-translated and machine-translated texts.
Anthology ID:
2022.eamt-1.30
Volume:
Proceedings of the 23rd Annual Conference of the European Association for Machine Translation
Month:
June
Year:
2022
Address:
Ghent, Belgium
Editors:
Helena Moniz, Lieve Macken, Andrew Rufener, Loïc Barrault, Marta R. Costa-jussà, Christophe Declercq, Maarit Koponen, Ellie Kemp, Spyridon Pilos, Mikel L. Forcada, Carolina Scarton, Joachim Van den Bogaert, Joke Daems, Arda Tezcan, Bram Vanroy, Margot Fonteyne
Venue:
EAMT
Publisher:
European Association for Machine Translation
Pages:
271–280
URL:
https://aclanthology.org/2022.eamt-1.30
Cite (ACL):
Sabrina Girletti. 2022. Working with Pre-translated Texts: Preliminary Findings from a Survey on Post-editing and Revision Practices in Swiss Corporate In-house Language Services. In Proceedings of the 23rd Annual Conference of the European Association for Machine Translation, pages 271–280, Ghent, Belgium. European Association for Machine Translation.
Cite (Informal):
Working with Pre-translated Texts: Preliminary Findings from a Survey on Post-editing and Revision Practices in Swiss Corporate In-house Language Services (Girletti, EAMT 2022)
PDF:
https://aclanthology.org/2022.eamt-1.30.pdf