DataDreamer: A Tool for Synthetic Data Generation and Reproducible LLM Workflows

Ajay Patel, Colin Raffel, Chris Callison-Burch


Abstract
Large language models (LLMs) have become a dominant and important tool for NLP researchers across a wide range of tasks. Today, many researchers use LLMs for synthetic data generation, task evaluation, fine-tuning, distillation, and other model-in-the-loop research workflows. However, these models pose challenges that stem from their scale, their closed-source nature, and the lack of standardized tooling for these new and emerging workflows. The rapid rise to prominence of these models, combined with these unique challenges, has had immediate adverse impacts on open science and on the reproducibility of work that uses them. In this ACL 2024 theme track paper, we introduce DataDreamer, an open-source Python library that allows researchers to write simple code to implement powerful LLM workflows. DataDreamer also helps researchers adhere to best practices that we propose to encourage open science and reproducibility. The library and documentation are available at: https://github.com/datadreamer-dev/DataDreamer.
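
To make the kind of workflow the abstract describes concrete, below is a minimal sketch of a synthetic data generation session with DataDreamer. It is modeled on the quick-start example in the project README; the class names (DataDreamer, OpenAI, DataFromPrompt) and their arguments follow that example, but this is an illustrative sketch, not an authoritative reference, and the linked repository documents the actual interface.

    from datadreamer import DataDreamer
    from datadreamer.llms import OpenAI
    from datadreamer.steps import DataFromPrompt

    # Open a DataDreamer session; each step's inputs, prompts, and
    # outputs are cached under this folder, which is what makes
    # workflows resumable and reproducible.
    with DataDreamer("./output"):
        # Wrap an LLM behind DataDreamer's uniform model interface.
        gpt_4 = OpenAI(model_name="gpt-4")

        # Generate a small synthetic dataset from a single instruction.
        abstracts = DataFromPrompt(
            "Generate Research Paper Abstracts",
            args={
                "llm": gpt_4,
                "n": 100,
                "temperature": 1.2,
                "instruction": (
                    "Generate an arXiv abstract of an NLP research paper."
                    " Return just the abstract, no titles."
                ),
            },
            outputs={"generations": "abstracts"},
        )

        # Share the result; DataDreamer attaches an automatically
        # generated synthetic data card. The repo id below is a
        # hypothetical placeholder.
        abstracts.publish_to_hf_hub("your-username/synthetic-abstracts")

Because intermediate results are serialized in the session folder, re-running the script reuses cached step outputs rather than re-querying the model, which is how the library supports the reproducibility practices the paper proposes.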
Anthology ID: 2024.acl-long.208
Volume: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 3781–3799
URL: https://aclanthology.org/2024.acl-long.208
Cite (ACL): Ajay Patel, Colin Raffel, and Chris Callison-Burch. 2024. DataDreamer: A Tool for Synthetic Data Generation and Reproducible LLM Workflows. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3781–3799, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal): DataDreamer: A Tool for Synthetic Data Generation and Reproducible LLM Workflows (Patel et al., ACL 2024)
PDF: https://aclanthology.org/2024.acl-long.208.pdf