Language Matters: Target-Language Supervision for Political Bias Detection in Turkish News

Umut Ozbagriacik, Haim Dubossarsky


Abstract
We present, to our knowledge, the first systematic transformer-based outlet-ideology classification study for Turkish news. Using a topic-balanced corpus of Turkish political articles drawn from six outlets commonly perceived as left-, centre-, or right-leaning, we formulate a three-way outlet-ideology classification task. On this dataset, we evaluate a monolingual encoder (BERTurk), two multilingual encoders (mBERT, XLM-R), and a LoRA-adapted decoder model (Mistral). BERTurk achieves the best performance among individual models (70% accuracy, 71% macro-F1), reaching levels comparable to English-language studies despite operating in a lower-resource setting. Error analyses show that all encoders reliably distinguish centrist from partisan articles, but frequently confuse left- and right-leaning articles with each other. Moreover, BERTurk is relatively stronger on right-leaning content, whereas the multilingual models favour left-leaning content, suggesting an “ideological fingerprint” of their pre-training data. Crucially, models fine-tuned on an English political-bias task fail to transfer to Turkish, collapsing to near-chance performance. Taken together, these results demonstrate that effective political bias detection requires target-language supervision and cannot be achieved through naïve cross-lingual transfer. Our work establishes a first baseline for Turkish political bias detection and underscores the need for open, carefully designed Turkish (and broader Turkic) bias benchmarks to support robust and fair media analysis.
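The reported scores (70% accuracy, 71% macro-F1) use macro-averaged F1, the unweighted mean of per-class F1 over the three outlet-ideology classes. As a minimal illustration only, here is a from-scratch sketch of macro-F1 on invented toy labels; the label names, predictions, and the left/right confusion pattern below are illustrative assumptions, not the paper's data:

```python
def macro_f1(y_true, y_pred, labels):
    """Macro-averaged F1: unweighted mean of per-class F1 scores."""
    f1_scores = []
    for c in labels:
        # Per-class counts, treating class c as the positive class.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        f1_scores.append(f1)
    # Each class contributes equally, regardless of class frequency.
    return sum(f1_scores) / len(f1_scores)

# Toy example (invented): centrist articles classified correctly,
# while left- and right-leaning articles are confused with each other.
labels = ["left", "centre", "right"]
y_true = ["left", "left", "centre", "centre", "right", "right"]
y_pred = ["left", "right", "centre", "centre", "right", "left"]
print(round(macro_f1(y_true, y_pred, labels), 3))  # → 0.667
```

In this toy run the centre class scores F1 = 1.0 while left and right each score 0.5, so the macro average of 0.667 exposes the partisan confusion that plain accuracy (here also 0.667) would not distinguish by class.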
Anthology ID:
2026.sigturk-1.7
Volume:
Proceedings of the Second Workshop on Natural Language Processing for Turkic Languages (SIGTURK 2026)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Kemal Oflazer, Abdullatif Köksal, Onur Varol
Venues:
SIGTURK | WS
Publisher:
Association for Computational Linguistics
Pages:
72–81
URL:
https://aclanthology.org/2026.sigturk-1.7/
Cite (ACL):
Umut Ozbagriacik and Haim Dubossarsky. 2026. Language Matters: Target-Language Supervision for Political Bias Detection in Turkish News. In Proceedings of the Second Workshop on Natural Language Processing for Turkic Languages (SIGTURK 2026), pages 72–81, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Language Matters: Target-Language Supervision for Political Bias Detection in Turkish News (Ozbagriacik & Dubossarsky, SIGTURK 2026)
PDF:
https://aclanthology.org/2026.sigturk-1.7.pdf