An Active Learning Framework for Inclusive Generation by Large Language Models

Sabit Hassan, Anthony B. Sicilia, Malihe Alikhani


Abstract
Ensuring that Large Language Models (LLMs) generate text representative of diverse sub-populations is essential, particularly when key concepts related to under-represented groups are scarce in the training data. We address this challenge with a novel clustering-based active learning framework enhanced with knowledge distillation. The proposed framework transforms the intermediate outputs of the learner model, enabling effective active learning for generative tasks for the first time. Integrating clustering and knowledge distillation yields more representative models without prior knowledge of the underlying data distribution and without burdensome human effort. We validate our approach in practice through case studies in counter-narration and style transfer. We construct two new datasets in tandem with model training, showing a performance improvement of 2%–10% over baseline models. Our results also show more consistent performance across various data subgroups and increased lexical diversity, underscoring our model’s resilience to skewness in the available data. Furthermore, the data acquired via our approach improves the performance of secondary models not involved in the learning loop, demonstrating the practical utility of the framework.
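To make the clustering-based selection idea concrete, below is a minimal illustrative sketch (not the authors' actual method, whose details, clustering algorithm, and representations differ): embed the unlabeled pool, cluster the embeddings with a simple k-means, and query the instance nearest each centroid for human annotation. All function names here (`kmeans`, `select_queries`) are hypothetical.

```python
# Hedged sketch of clustering-based active learning sample selection:
# cluster the unlabeled pool and pick one representative per cluster
# to send for annotation. Pure-stdlib k-means for self-containment.
import math
import random


def _dist(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def kmeans(points, k, iters=20, seed=0):
    """Simple Lloyd's k-means; returns the final list of k centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: _dist(p, centroids[c]))
            clusters[i].append(p)
        # Recompute centroids; keep the old one if a cluster emptied.
        centroids = [
            [sum(dim) / len(c) for dim in zip(*c)] if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids


def select_queries(pool, k):
    """Return one representative index per cluster to label next."""
    centroids = kmeans(pool, k)
    return sorted(
        min(range(len(pool)), key=lambda i: _dist(pool[i], c))
        for c in centroids
    )
```

In a full active learning loop, `pool` would hold model-derived embeddings of unlabeled examples; querying near-centroid points spreads annotation effort across the data's modes rather than concentrating it on the majority subgroup.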
Anthology ID:
2025.coling-main.362
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
5403–5414
URL:
https://aclanthology.org/2025.coling-main.362/
Cite (ACL):
Sabit Hassan, Anthony B. Sicilia, and Malihe Alikhani. 2025. An Active Learning Framework for Inclusive Generation by Large Language Models. In Proceedings of the 31st International Conference on Computational Linguistics, pages 5403–5414, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
An Active Learning Framework for Inclusive Generation by Large Language Models (Hassan et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.362.pdf