Data Pruning for Efficient Model Pruning in Neural Machine Translation

Abdul Azeemi, Ihsan Qazi, Agha Raza


Abstract
Model pruning methods reduce the memory requirements and inference time of large-scale pre-trained language models for deployment. However, the actual pruning procedure is computationally intensive, involving repeated training and pruning until the required sparsity is achieved. This paper combines data pruning with movement pruning for Neural Machine Translation (NMT) to enable efficient fine-pruning. We design a dataset pruning strategy by leveraging cross-entropy scores of individual training instances. We conduct pruning experiments on Romanian-to-English and Turkish-to-English machine translation, and demonstrate that selecting hard-to-learn examples (top-k) based on training cross-entropy scores outperforms other dataset pruning methods. We empirically demonstrate that data pruning reduces the overall steps required for convergence and the training time of movement pruning. Finally, we perform a series of experiments to tease apart the role of training data during movement pruning and uncover new insights to understand the interplay between data and model pruning in the context of NMT.
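The selection step described in the abstract (score each training instance by its cross-entropy and keep the top-k hardest examples before movement pruning) can be sketched as follows. This is a minimal illustration, assuming a PyTorch seq2seq model with HuggingFace-style input_ids/attention_mask/labels batches; the function names and the keep_fraction parameter are illustrative assumptions, not taken from the paper's implementation.

```python
# Minimal sketch: rank training examples by cross-entropy and keep the
# hardest top-k fraction before fine-pruning. Names (per_example_cross_entropy,
# topk_hard_indices, keep_fraction) are illustrative, not from the paper's code.
import torch
import torch.nn.functional as F


@torch.no_grad()
def per_example_cross_entropy(model, batch):
    """Mean token-level cross-entropy for each example in a tokenized batch."""
    logits = model(
        input_ids=batch["input_ids"],
        attention_mask=batch["attention_mask"],
        labels=batch["labels"],
    ).logits                                    # (batch, seq_len, vocab)
    token_loss = F.cross_entropy(
        logits.transpose(1, 2),                 # (batch, vocab, seq_len)
        batch["labels"],                        # (batch, seq_len)
        ignore_index=-100,                      # skip padded target positions
        reduction="none",
    )
    mask = (batch["labels"] != -100).float()
    return (token_loss * mask).sum(dim=1) / mask.sum(dim=1)   # (batch,)


def topk_hard_indices(scores, keep_fraction=0.5):
    """Indices of the hardest (highest cross-entropy) examples to retain."""
    scores = torch.as_tensor(scores)
    k = max(1, int(len(scores) * keep_fraction))
    return torch.topk(scores, k).indices.tolist()
```

Scores for the full training set can be accumulated batch by batch with the first function, and the subset returned by topk_hard_indices is then the data actually used during movement pruning; selecting the lowest-scoring or randomly chosen examples instead corresponds to the alternative dataset pruning strategies the abstract compares against.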
Anthology ID: 2023.findings-emnlp.18
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 236–246
URL: https://aclanthology.org/2023.findings-emnlp.18
DOI: 10.18653/v1/2023.findings-emnlp.18
Cite (ACL): Abdul Azeemi, Ihsan Qazi, and Agha Raza. 2023. Data Pruning for Efficient Model Pruning in Neural Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 236–246, Singapore. Association for Computational Linguistics.
Cite (Informal): Data Pruning for Efficient Model Pruning in Neural Machine Translation (Azeemi et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.18.pdf