Multi-News+: Cost-efficient Dataset Cleansing via LLM-based Data Annotation

Juhwan Choi, JungMin Yun, Kyohoon Jin, YoungBin Kim


Abstract
The quality of a dataset is crucial for ensuring optimal performance and reliability of downstream task models. However, datasets often contain noisy data inadvertently included during the construction process. Numerous attempts have been made to correct this issue with human annotators; however, hiring and managing human annotators is expensive and time-consuming. As an alternative, recent studies have explored the use of large language models (LLMs) for data annotation. In this study, we present a case study that extends the application of LLM-based data annotation to enhance the quality of existing datasets through a cleansing strategy. Specifically, we leverage approaches such as chain-of-thought prompting and majority voting to imitate human annotation and classify unrelated documents in the Multi-News dataset, which is widely used for the multi-document summarization task. Through our proposed cleansing method, we introduce an enhanced dataset, Multi-News+. By employing LLMs for data cleansing, we demonstrate an efficient and effective approach to improving dataset quality without relying on expensive human annotation efforts.
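To make the cleansing strategy described in the abstract concrete, below is a minimal Python sketch of LLM-based annotation with chain-of-thought prompting and majority voting. It is not the authors' code: the prompt wording, model name, label format, and number of votes are illustrative assumptions, and it uses the OpenAI chat completions client only as an example backend.

# Minimal sketch: classify whether a source document is related to its summary,
# using chain-of-thought prompting and majority voting over several samples.
from collections import Counter
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = (
    "You are given a summary and one candidate source document from a "
    "multi-document summarization example.\n\n"
    "Summary:\n{summary}\n\nDocument:\n{document}\n\n"
    "Think step by step about whether the document contributed information to "
    "the summary, then answer on the last line with exactly RELATED or UNRELATED."
)

def annotate_once(summary: str, document: str, model: str = "gpt-3.5-turbo") -> str:
    """Single chain-of-thought annotation; returns 'RELATED' or 'UNRELATED'."""
    response = client.chat.completions.create(
        model=model,
        temperature=1.0,  # sampling diversity so repeated votes can disagree
        messages=[{"role": "user",
                   "content": PROMPT.format(summary=summary, document=document)}],
    )
    last_line = response.choices[0].message.content.strip().upper().splitlines()[-1]
    return "UNRELATED" if last_line.strip().startswith("UNRELATED") else "RELATED"

def annotate_by_majority(summary: str, document: str, n_votes: int = 5) -> str:
    """Imitate multiple human annotators: sample several judgments, take the majority."""
    votes = Counter(annotate_once(summary, document) for _ in range(n_votes))
    return votes.most_common(1)[0][0]

# Usage sketch: keep only documents voted RELATED when rebuilding a cleansed example.
# cleansed_docs = [d for d in example["documents"]
#                  if annotate_by_majority(example["summary"], d) == "RELATED"]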
Anthology ID:
2024.emnlp-main.2
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
15–29
URL:
https://aclanthology.org/2024.emnlp-main.2
Cite (ACL):
Juhwan Choi, JungMin Yun, Kyohoon Jin, and YoungBin Kim. 2024. Multi-News+: Cost-efficient Dataset Cleansing via LLM-based Data Annotation. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 15–29, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Multi-News+: Cost-efficient Dataset Cleansing via LLM-based Data Annotation (Choi et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.2.pdf