In-context Learning as Maintaining Coherency: A Study of On-the-fly Machine Translation Using Large Language Models

Suzanna Sia, Kevin Duh


Abstract
The phenomenon of in-context learning has typically been thought of as “learning from examples”. In this work, which focuses on Machine Translation, we present a perspective of in-context learning as the desired generation task maintaining coherency with its context, i.e., the prompt examples. We first investigate randomly sampled prompts across 4 domains and find that translation performance improves when shown in-domain prompts. Next, we investigate coherency for the in-domain setting, which uses prompt examples from a moving window. We study this with respect to other factors that have previously been identified in the literature, such as length, surface similarity, and sentence embedding similarity. Our results across 3 models (GPTNeo2.7B, Bloom3B, XGLM2.9B) and three translation directions (en→{pt, de, fr}) suggest that the long-term coherency of the prompts and the test sentence is a good indicator of downstream translation performance. In doing so, we demonstrate the efficacy of in-context Machine Translation for on-the-fly adaptation.
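To make the on-the-fly setup concrete, below is a minimal sketch of moving-window in-context translation with one of the models studied (GPT-Neo 2.7B via Hugging Face Transformers). The "English:/French:" prompt template, the window size k, the toy example pairs, and the greedy decoding settings are illustrative assumptions, not the paper's exact configuration.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "EleutherAI/gpt-neo-2.7B"  # one of the three models studied

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

def build_prompt(examples, source):
    """Concatenate (source, target) prompt pairs, then the test source sentence.
    The template below is an assumed format, not the paper's exact one."""
    parts = [f"English: {src}\nFrench: {tgt}" for src, tgt in examples]
    parts.append(f"English: {source}\nFrench:")
    return "\n".join(parts)

def translate(examples, source, max_new_tokens=64):
    prompt = build_prompt(examples, source)
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    generated = output[0][inputs["input_ids"].shape[1]:]
    text = tokenizer.decode(generated, skip_special_tokens=True)
    return text.split("\n")[0].strip()  # keep only the first generated line

# Moving window: the k most recent in-domain pairs form the prompt context.
k = 3
history = [  # hypothetical in-domain (en, fr) pairs from the same document
    ("The meeting was postponed.", "La réunion a été reportée."),
    ("The report is due on Friday.", "Le rapport doit être rendu vendredi."),
    ("The committee reviewed the draft.", "Le comité a examiné le projet."),
    ("Members raised several objections.", "Les membres ont soulevé plusieurs objections."),
]
window = history[-k:]
print(translate(window, "The committee approved the proposal."))
```

Under the paper's framing, translation quality should improve as the window's examples are more coherent with the test input, which is why same-domain, adjacent sentences are used here rather than randomly sampled pairs.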
Anthology ID: 2023.mtsummit-research.15
Volume: Proceedings of Machine Translation Summit XIX, Vol. 1: Research Track
Month: September
Year: 2023
Address: Macau SAR, China
Editors: Masao Utiyama, Rui Wang
Venue: MTSummit
Publisher: Asia-Pacific Association for Machine Translation
Pages: 173–185
URL: https://aclanthology.org/2023.mtsummit-research.15
Cite (ACL): Suzanna Sia and Kevin Duh. 2023. In-context Learning as Maintaining Coherency: A Study of On-the-fly Machine Translation Using Large Language Models. In Proceedings of Machine Translation Summit XIX, Vol. 1: Research Track, pages 173–185, Macau SAR, China. Asia-Pacific Association for Machine Translation.
Cite (Informal): In-context Learning as Maintaining Coherency: A Study of On-the-fly Machine Translation Using Large Language Models (Sia & Duh, MTSummit 2023)
PDF: https://aclanthology.org/2023.mtsummit-research.15.pdf