The ‘purest’ EBMT System Ever Built: No Variables, No Templates, No Training, Examples, Just Examples, Only Examples

Yves Lepage, Etienne Denoual


Abstract
We designed, implemented and assessed an EBMT system that can be dubbed the “purest ever built”: it strictly does not make any use of variables, templates or training, does not have any explicit transfer component, and does not require any preprocessing of the aligned examples. It uses a specific operation, namely proportional analogy, that implicitly neutralises divergences between languages and captures lexical and syntactic variation along the paradigmatic and syntagmatic axes without explicitly decomposing sentences into fragments. In an experiment with a test set of 510 input sentences and an unprocessed corpus of almost 160,000 aligned sentences in Japanese and English, we obtained BLEU, NIST and mWER scores of 0.53, 8.53 and 0.39 respectively, well above a baseline simulating a translation memory.
Anthology ID:
2005.mtsummit-ebmt.11
Volume:
Workshop on example-based machine translation
Month:
September 13–15
Year:
2005
Address:
Phuket, Thailand
Venue:
MTSummit
Pages:
81–90
URL:
https://aclanthology.org/2005.mtsummit-ebmt.11
Cite (ACL):
Yves Lepage and Etienne Denoual. 2005. The ‘purest’ EBMT System Ever Built: No Variables, No Templates, No Training, Examples, Just Examples, Only Examples. In Workshop on example-based machine translation, pages 81–90, Phuket, Thailand.
Cite (Informal):
The ‘purest’ EBMT System Ever Built: No Variables, No Templates, No Training, Examples, Just Examples, Only Examples (Lepage & Denoual, MTSummit 2005)
PDF:
https://aclanthology.org/2005.mtsummit-ebmt.11.pdf