Efficient Transfer Learning for Quality Estimation with Bottleneck Adapter Layer

Hao Yang, Minghan Wang, Ning Xie, Ying Qin, Yao Deng


Abstract
The Predictor-Estimator framework is widely used for quality estimation (QE) because of its strong performance: the predictor performs feature extraction and the estimator performs quality evaluation. However, training the predictor from scratch is computationally expensive. In this paper, we propose an efficient transfer learning framework that transfers knowledge from NMT data into QE models. We also propose a Predictor-Estimator-style model named BAL-QE, which extracts high-quality features with a pre-trained NMT model and performs classification with a fine-tuned Bottleneck Adapter Layer (BAL). Experiments show that BAL-QE achieves 97% of the SOTA performance on the WMT19 En-De and En-Ru QE tasks while training only 3% of the parameters, within 4 hours on 4 Titan XP GPUs. Compared with the widely used NuQE baseline, BAL-QE yields performance improvements of 47% (En-Ru) and 75% (En-De).
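A bottleneck adapter of the kind the abstract refers to can be sketched as a small residual sub-layer: a down-projection into a low-dimensional bottleneck, a nonlinearity, and an up-projection back to the model dimension, with the pre-trained predictor weights kept frozen. The sketch below is a minimal NumPy illustration of this general adapter pattern, not the paper's implementation; the dimensions and near-zero initialization are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_bottleneck = 8, 2   # hypothetical sizes for illustration

# Near-zero initialization keeps the adapter close to an identity
# function at the start of fine-tuning, so the frozen NMT features
# pass through almost unchanged until the adapter is trained.
W_down = rng.normal(scale=0.01, size=(d_model, d_bottleneck))
W_up = rng.normal(scale=0.01, size=(d_bottleneck, d_model))

def bottleneck_adapter(x):
    """Down-project, apply ReLU, up-project, then add the residual."""
    h = np.maximum(x @ W_down, 0.0)   # (batch, d_bottleneck)
    return x + h @ W_up               # same shape as x

x = rng.normal(size=(3, d_model))     # a batch of 3 feature vectors
y = bottleneck_adapter(x)
print(y.shape)                        # shape is preserved: (3, 8)
```

Only the two projection matrices (2 * d_model * d_bottleneck parameters) would be trained, which is how adapter-based fine-tuning keeps the trainable-parameter count to a small fraction of the full model.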
Anthology ID:
2020.eamt-1.4
Volume:
Proceedings of the 22nd Annual Conference of the European Association for Machine Translation
Month:
November
Year:
2020
Address:
Lisboa, Portugal
Editors:
André Martins, Helena Moniz, Sara Fumega, Bruno Martins, Fernando Batista, Luisa Coheur, Carla Parra, Isabel Trancoso, Marco Turchi, Arianna Bisazza, Joss Moorkens, Ana Guerberof, Mary Nurminen, Lena Marg, Mikel L. Forcada
Venue:
EAMT
Publisher:
European Association for Machine Translation
Pages:
29–34
URL:
https://aclanthology.org/2020.eamt-1.4
Cite (ACL):
Hao Yang, Minghan Wang, Ning Xie, Ying Qin, and Yao Deng. 2020. Efficient Transfer Learning for Quality Estimation with Bottleneck Adapter Layer. In Proceedings of the 22nd Annual Conference of the European Association for Machine Translation, pages 29–34, Lisboa, Portugal. European Association for Machine Translation.
Cite (Informal):
Efficient Transfer Learning for Quality Estimation with Bottleneck Adapter Layer (Yang et al., EAMT 2020)
PDF:
https://aclanthology.org/2020.eamt-1.4.pdf