Paula Manzur


2021

MT Human Evaluation – Insights & Approaches
Paula Manzur
Proceedings of Machine Translation Summit XVIII: Users and Providers Track

This session is designed to help companies and people in the business of translation evaluate MT output and to show how human translator feedback can be refined to make the process more objective and accurate. You will hear recommendations, insights, and takeaways on how to improve the procedure for human evaluation. Once this is achieved, we can understand whether the human evaluation study and the automatic metric results cohere, and we can think about what the future of translators looks like: the final “human touch” and automated MT review.