@inproceedings{gordon-duh-2020-distill,
    title     = "Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation",
    author    = "Gordon, Mitchell and Duh, Kevin",
    editor    = "Birch, Alexandra and Finch, Andrew and Hayashi, Hiroaki and Heafield, Kenneth and Junczys-Dowmunt, Marcin and Konstas, Ioannis and Li, Xian and Neubig, Graham and Oda, Yusuke",
    booktitle = "Proceedings of the Fourth Workshop on Neural Generation and Translation",
    month     = jul,
    year      = "2020",
    address   = "Online",
    publisher = "Association for Computational Linguistics",
    url       = "https://aclanthology.org/2020.ngt-1.12/",
    doi       = "10.18653/v1/2020.ngt-1.12",
    pages     = "110--118",
}