UniGen: Universal Domain Generalization for Sentiment Classification via Zero-shot Dataset Generation

Juhwan Choi, Yeonghwa Kim, Seunguk Yu, JungMin Yun, YoungBin Kim


Abstract
Although pre-trained language models (PLMs) have exhibited great flexibility and versatility with prompt-based few-shot learning, they suffer from extensive parameter sizes and limited applicability for inference. Recent studies have suggested using PLMs as dataset generators and training a tiny task-specific model to achieve efficient inference. However, their applicability to various domains is limited because they tend to generate domain-specific datasets. In this work, we propose a novel approach to universal domain generalization that generates a dataset regardless of the target domain. This allows the tiny task model to generalize to any domain that shares the label space, thus enhancing the real-world applicability of the dataset generation paradigm. Our experiments indicate that the proposed method achieves generalizability across various domains while using a parameter set that is orders of magnitude smaller than that of PLMs.
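The abstract describes the dataset-generation paradigm at a high level: a PLM is prompted zero-shot to produce labeled, domain-agnostic sentiment examples, and a tiny task-specific model is then trained on the synthetic data so that only the small model is needed at inference time. The sketch below is a minimal illustration of that idea; the generator model (gpt2), the prompt wording, and the TF-IDF logistic-regression classifier are assumptions made for illustration and are not the authors' actual UniGen implementation.

```python
# Minimal sketch of zero-shot dataset generation followed by training a tiny
# task model. Model names, prompts, and the classifier are illustrative
# assumptions, not the method from the paper.
from transformers import pipeline, set_seed
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

set_seed(0)
generator = pipeline("text-generation", model="gpt2")  # stand-in PLM

# Zero-shot, domain-agnostic prompts: the sentiment label is fixed by the
# prompt, but no target domain (movies, products, restaurants, ...) is named.
prompts = {
    1: "Write a short text expressing a positive opinion:",
    0: "Write a short text expressing a negative opinion:",
}

texts, labels = [], []
for label, prompt in prompts.items():
    outputs = generator(prompt, max_new_tokens=40,
                        num_return_sequences=20, do_sample=True)
    for out in outputs:
        # Strip the prompt prefix, keep only the generated continuation.
        texts.append(out["generated_text"][len(prompt):].strip())
        labels.append(label)

# Train a tiny task-specific model on the synthetic dataset; the PLM is no
# longer needed once this model is trained.
vectorizer = TfidfVectorizer()
features = vectorizer.fit_transform(texts)
clf = LogisticRegression(max_iter=1000).fit(features, labels)

# Because the generated data carries no domain cues, the tiny model can be
# applied to any domain sharing the positive/negative label space.
test = ["The camera on this phone is fantastic.",
        "The service at the hotel was awful."]
print(clf.predict(vectorizer.transform(test)))  # expected: [1, 0]
```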
Anthology ID: 2024.emnlp-main.1
Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 1–14
URL: https://aclanthology.org/2024.emnlp-main.1
Cite (ACL):
Juhwan Choi, Yeonghwa Kim, Seunguk Yu, JungMin Yun, and YoungBin Kim. 2024. UniGen: Universal Domain Generalization for Sentiment Classification via Zero-shot Dataset Generation. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 1–14, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
UniGen: Universal Domain Generalization for Sentiment Classification via Zero-shot Dataset Generation (Choi et al., EMNLP 2024)
PDF: https://aclanthology.org/2024.emnlp-main.1.pdf
Software: 2024.emnlp-main.1.software.zip