Reinforcement Learning with Large Action Spaces for Neural Machine Translation

Asaf Yehudai, Leshem Choshen, Lior Fox, Omri Abend


Abstract
Applying reinforcement learning (RL) after maximum likelihood estimation (MLE) pre-training is a versatile method for enhancing neural machine translation (NMT) performance. However, recent work has argued that the gains produced by RL for NMT are mostly due to promoting tokens that have already received a fairly high probability in pre-training. We hypothesize that the large action space is a main obstacle to RL’s effectiveness in MT, and conduct two sets of experiments that lend support to our hypothesis. First, we find that reducing the size of the vocabulary improves RL’s effectiveness. Second, we find that effectively reducing the dimension of the action space without changing the vocabulary also yields notable improvement as evaluated by BLEU, semantic similarity, and human evaluation. Indeed, by initializing the network’s final fully connected layer (which maps the network’s internal dimension to the vocabulary dimension) with a layer that generalizes over similar actions, we obtain a substantial improvement in RL performance: 1.5 BLEU points on average.
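The final-layer idea can be illustrated with a small sketch. This is not the authors' exact construction (which is given in the full paper); it assumes, for illustration, that the generalizing output layer is built from pretrained token embeddings, so that tokens with similar embeddings receive similar logits and a policy-gradient update generalizes across them:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, vocab = 16, 100

# Hypothetical pretrained token embeddings, one row per vocabulary item.
E = rng.normal(size=(vocab, d_model))
E[1] = E[0] + 0.01 * rng.normal(size=d_model)  # token 1 is near-synonymous with token 0
E /= np.linalg.norm(E, axis=1, keepdims=True)

# Initialize the final fully connected layer from the embeddings instead of
# at random, so the map from hidden states to vocabulary logits respects
# token similarity.
W_out = E.copy()

h = rng.normal(size=(d_model,))  # a decoder hidden state
logits = W_out @ h               # scores over the vocabulary ("actions")

# Similar tokens receive similar logits, so an RL update that raises the
# probability of token 0 also tends to raise that of token 1.
print(abs(logits[0] - logits[1]))  # small, since the two embeddings nearly coincide
```

With a randomly initialized `W_out`, by contrast, rows for similar tokens are uncorrelated, and reward signal for one token does not transfer to its neighbors.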
Anthology ID:
2022.coling-1.401
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
4544–4556
URL:
https://aclanthology.org/2022.coling-1.401
Cite (ACL):
Asaf Yehudai, Leshem Choshen, Lior Fox, and Omri Abend. 2022. Reinforcement Learning with Large Action Spaces for Neural Machine Translation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 4544–4556, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Reinforcement Learning with Large Action Spaces for Neural Machine Translation (Yehudai et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.401.pdf