Do Data-based Curricula Work?

Maxim Surkov, Vladislav Mosin, Ivan Yamshchikov


Abstract
Current state-of-the-art NLP systems use large neural networks that require extensive computational resources for training. Inspired by human knowledge acquisition, researchers have proposed curriculum learning: sequencing tasks (task-based curricula) or ordering and sampling datasets (data-based curricula) to facilitate training. This work investigates the benefits of data-based curriculum learning for large language models such as BERT and T5. We experiment with various curricula based on complexity measures and with different sampling strategies. Extensive experiments on several NLP tasks show that curricula based on various complexity measures rarely provide any benefit, while random sampling performs as well as or better than the curricula.
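To illustrate the data-based setting the abstract describes, here is a minimal Python sketch: training examples are ordered from easy to hard by a simple complexity proxy (token count) and compared against a random-sampling baseline. The helpers `complexity`, `curriculum_order`, and `random_order` are hypothetical illustrations, not the paper's actual complexity measures or sampling strategies.

```python
import random


def complexity(example: str) -> int:
    """Toy complexity measure: number of whitespace-separated tokens.
    (The paper evaluates several measures; this is only an illustration.)"""
    return len(example.split())


def curriculum_order(dataset, measure=complexity):
    """Data-based curriculum: present examples from 'easy' to 'hard'
    according to the chosen complexity measure."""
    return sorted(dataset, key=measure)


def random_order(dataset, seed=0):
    """Baseline: random sampling of the same data."""
    shuffled = list(dataset)
    random.Random(seed).shuffle(shuffled)
    return shuffled


if __name__ == "__main__":
    toy_data = [
        "a short sentence",
        "an example that is a little bit longer than the first",
        "tiny",
        "a medium length training example here",
    ]
    print("curriculum:", curriculum_order(toy_data))
    print("random    :", random_order(toy_data))
```

The paper's finding is that orderings of this kind rarely outperform the random baseline.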
Anthology ID:
2022.insights-1.16
Volume:
Proceedings of the Third Workshop on Insights from Negative Results in NLP
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Shabnam Tafreshi, João Sedoc, Anna Rogers, Aleksandr Drozd, Anna Rumshisky, Arjun Akula
Venue:
insights
Publisher:
Association for Computational Linguistics
Pages:
119–128
URL:
https://aclanthology.org/2022.insights-1.16
DOI:
10.18653/v1/2022.insights-1.16
Cite (ACL):
Maxim Surkov, Vladislav Mosin, and Ivan Yamshchikov. 2022. Do Data-based Curricula Work?. In Proceedings of the Third Workshop on Insights from Negative Results in NLP, pages 119–128, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Do Data-based Curricula Work? (Surkov et al., insights 2022)
PDF:
https://aclanthology.org/2022.insights-1.16.pdf
Video:
https://aclanthology.org/2022.insights-1.16.mp4