CoDa: Constrained Generation based Data Augmentation for Low-Resource NLP

Chandra Kiran Evuru, Sreyan Ghosh, Sonal Kumar, Ramaneswaran S, Utkarsh Tyagi, Dinesh Manocha


Abstract
We present CoDa (**Co**nstrained Generation based **Da**ta Augmentation), a controllable, effective, and *training-free* data augmentation technique for low-resource (data-scarce) NLP. Our approach is based on prompting off-the-shelf instruction-following Large Language Models (LLMs) to generate text that satisfies a set of constraints. Specifically, we extract a set of simple constraints from every instance in the low-resource dataset and verbalize them to prompt an LLM to generate novel and diverse training instances. Our findings reveal that synthetic data that follows simple constraints in the downstream dataset acts as a highly effective augmentation, and CoDa achieves this without intricate decoding-time constrained generation techniques or fine-tuning with complex algorithms that eventually bias the model toward the small number of training instances. Additionally, CoDa is the first framework that gives users explicit control over the augmentation generation process, thereby also allowing easy adaptation to several domains. We demonstrate the effectiveness of CoDa across 11 datasets spanning 3 tasks and 3 low-resource settings. CoDa outperforms all our baselines, qualitatively and quantitatively, with improvements of 0.12%-7.19%. Code is available.
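The abstract's pipeline (extract simple constraints per instance, verbalize them, prompt an LLM) can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the constraint types (label, length, keywords) and the keyword heuristic are hypothetical stand-ins for whatever constraints the paper actually extracts, and the resulting prompt would be sent to any instruction-following LLM.

```python
# Hypothetical sketch of CoDa-style prompt construction (illustration only,
# not the paper's implementation). For each low-resource training instance
# we extract simple constraints and verbalize them into an instruction
# prompt for an off-the-shelf instruction-following LLM.

def extract_constraints(text: str, label: str) -> dict:
    """Extract simple, illustrative constraints from one training instance."""
    words = text.split()
    # Toy heuristic: take the three longest words as salient keywords.
    keywords = sorted(words, key=len, reverse=True)[:3]
    return {"label": label, "length": len(words), "keywords": keywords}

def verbalize(constraints: dict) -> str:
    """Turn the extracted constraints into a natural-language generation prompt."""
    kw = ", ".join(constraints["keywords"])
    return (
        f"Generate a new sentence of about {constraints['length']} words "
        f"that expresses the sentiment '{constraints['label']}' "
        f"and mentions the following keywords: {kw}."
    )

prompt = verbalize(extract_constraints(
    "The restaurant service was impressively quick and friendly", "positive"))
print(prompt)
```

Because generation is steered only through the verbalized prompt, no decoding-time constraint machinery or fine-tuning is needed, which is what makes the approach training-free.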
Anthology ID:
2024.findings-naacl.238
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3754–3769
URL:
https://aclanthology.org/2024.findings-naacl.238
DOI:
10.18653/v1/2024.findings-naacl.238
Cite (ACL):
Chandra Kiran Evuru, Sreyan Ghosh, Sonal Kumar, Ramaneswaran S, Utkarsh Tyagi, and Dinesh Manocha. 2024. CoDa: Constrained Generation based Data Augmentation for Low-Resource NLP. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 3754–3769, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
CoDa: Constrained Generation based Data Augmentation for Low-Resource NLP (Evuru et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-naacl.238.pdf