Reinforced Sequence Training based Subjective Bias Correction

Karthic Madanagopal, James Caverlee


Abstract
Subjective bias is ubiquitous on news sites, social media, and knowledge resources like Wikipedia. Many existing methods for subjective bias correction focus on making single-word edits and are trained over a single (often noisy) domain. In contrast, we propose a novel reinforced sequence training approach for robust subjective bias correction. Three unique characteristics of the approach are: (i) it balances bias neutralization with fluency and semantics preservation through reinforcement learning, broadening the scope of corrections beyond a single word; (ii) it is cross-trained over multiple sources of bias, making it more robust to styles of biased writing not seen in the training data of any single domain; and (iii) it is used to fine-tune a large pre-trained transformer model, yielding state-of-the-art performance on the subjective bias correction task. Extensive experiments show that the proposed approach delivers significant improvements in subjective bias correction versus alternatives.
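
To illustrate the kind of balance the abstract describes, a minimal sketch of a reinforcement-learning reward combining bias neutrality, fluency, and semantic preservation is shown below. This is not the paper's formulation: the weights and the three toy scoring functions are hypothetical placeholders standing in for real models (e.g., a bias classifier, a language model, and a sentence-similarity model).

    # Illustrative sketch only: a weighted-sum reward over the three signals
    # named in the abstract. All scorers below are toy placeholders, not the
    # paper's actual models.

    def bias_neutrality_score(text: str) -> float:
        """Placeholder: a real system would use a bias classifier, e.g. 1 - P(biased)."""
        biased_words = {"fortunately", "unfortunately", "amazing", "terrible"}
        tokens = text.lower().split()
        return 1.0 - sum(t in biased_words for t in tokens) / max(len(tokens), 1)

    def fluency_score(text: str) -> float:
        """Placeholder: a real system would use a normalized language-model likelihood."""
        return 1.0 if text and text[0].isupper() and text.endswith(".") else 0.5

    def semantic_similarity(source: str, candidate: str) -> float:
        """Placeholder: a real system would use embedding cosine similarity."""
        a, b = set(source.lower().split()), set(candidate.lower().split())
        return len(a & b) / max(len(a | b), 1)

    def combined_reward(source: str, candidate: str,
                        w_bias: float = 0.5, w_flu: float = 0.25,
                        w_sem: float = 0.25) -> float:
        """Weighted sum of the three reward signals, each assumed in [0, 1]."""
        return (w_bias * bias_neutrality_score(candidate)
                + w_flu * fluency_score(candidate)
                + w_sem * semantic_similarity(source, candidate))

    # Usage: reward a candidate rewrite against its biased source sentence.
    print(combined_reward("The amazing senator proposed a bill.",
                          "The senator proposed a bill."))

In practice, a sequence-level reward like this would score sampled model outputs during policy-gradient fine-tuning; the weighted-sum form is one common way to trade off competing objectives.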
Anthology ID:
2023.eacl-main.189
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
2585–2598
URL:
https://aclanthology.org/2023.eacl-main.189
DOI:
10.18653/v1/2023.eacl-main.189
Cite (ACL):
Karthic Madanagopal and James Caverlee. 2023. Reinforced Sequence Training based Subjective Bias Correction. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 2585–2598, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Reinforced Sequence Training based Subjective Bias Correction (Madanagopal & Caverlee, EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.189.pdf
Video:
https://aclanthology.org/2023.eacl-main.189.mp4