Controlling Out-of-Domain Gaps in LLMs for Genre Classification and Generated Text Detection

Dmitri Roussinov, Serge Sharoff, Nadezhda Puchnina


Abstract
This study demonstrates that the modern generation of Large Language Models (LLMs, such as GPT-4) suffers from the same out-of-domain (OOD) performance gap observed in prior research on pre-trained Language Models (PLMs, such as BERT). We show this across two non-topical classification tasks: (1) genre classification and (2) generated text detection. Our results show that when demonstration examples for In-Context Learning (ICL) come from one domain (e.g., travel) and the system is tested on another domain (e.g., history), classification performance declines significantly. To address this, we introduce a method that controls which predictive indicators are used and which are excluded during classification. For the two tasks studied here, this ensures that topical features are omitted, while the model is guided to focus on stylistic rather than content-based attributes. This approach reduces the OOD gap by up to 20 percentage points in a few-shot setup. Straightforward Chain-of-Thought (CoT) methods, used as the baseline, prove insufficient, while our approach consistently enhances domain transfer performance.
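The abstract's setup can be illustrated with a minimal sketch of a few-shot (ICL) prompt whose instruction steers the model toward stylistic indicators and away from topical ones. Everything below is hypothetical: the instruction wording, demonstration texts, and genre labels are invented for illustration and are not the authors' actual prompts or their controlled lists of indicators.

```python
# Minimal sketch of a style-focused few-shot prompt for genre classification.
# All texts, labels, and wording are illustrative assumptions, not the paper's.

def build_style_only_prompt(demos, query_text):
    """Assemble an ICL prompt from (text, genre) demonstrations plus a query.

    The instruction explicitly excludes topical cues (named entities,
    subject-matter vocabulary) and directs attention to stylistic ones.
    """
    instruction = (
        "Classify the GENRE of the text. Base your decision only on style: "
        "sentence structure, formality, use of pronouns and tense. "
        "Ignore the topic, named entities, and subject-specific vocabulary.\n\n"
    )
    shots = "".join(f"Text: {t}\nGenre: {g}\n\n" for t, g in demos)
    return instruction + shots + f"Text: {query_text}\nGenre:"

# Demonstrations drawn from one domain (travel) ...
demos = [
    ("Day 3: we caught the early ferry and wandered the old town.", "personal blog"),
    ("Visitors must present a valid permit at the park entrance.", "official notice"),
]
# ... while the query comes from another domain (history), probing the OOD gap.
prompt = build_style_only_prompt(
    demos, "The treaty was ratified after months of negotiation."
)
print(prompt)
```

The resulting string would be sent to an LLM as-is; the cross-domain mismatch between the demonstrations and the query is what produces the OOD gap the paper measures.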
Anthology ID:
2025.coling-main.224
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
3329–3344
URL:
https://aclanthology.org/2025.coling-main.224/
Cite (ACL):
Dmitri Roussinov, Serge Sharoff, and Nadezhda Puchnina. 2025. Controlling Out-of-Domain Gaps in LLMs for Genre Classification and Generated Text Detection. In Proceedings of the 31st International Conference on Computational Linguistics, pages 3329–3344, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Controlling Out-of-Domain Gaps in LLMs for Genre Classification and Generated Text Detection (Roussinov et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.224.pdf