Incorporating Subjectivity into Gendered Ambiguous Pronoun (GAP) Resolution using Style Transfer

Kartikey Pant, Tanvi Dadu


Abstract
The GAP dataset is a Wikipedia-based evaluation dataset for gender bias detection in coreference resolution, consisting mostly of objective sentences. Since subjectivity is ubiquitous in everyday text, models should be evaluated on both subjective and objective instances. In this work, we present GAP-Subjective, a new evaluation dataset for gender bias in coreference resolution that extends the coverage of the original GAP dataset with subjective sentences. We outline the methodology used to create this dataset. First, we detect objective sentences and transfer them into subjective variants using a sequence-to-sequence model. Second, we describe thresholding techniques based on fluency and content preservation that maintain sentence quality. Third, we perform automated and human evaluation of the style transfer and find that the transferred sentences are of high quality. Finally, we benchmark both the GAP and GAP-Subjective datasets using a BERT-based model and analyze its predictive performance and gender bias.
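The thresholding step described above (keeping a transferred sentence only if it passes both a fluency check and a content-preservation check) can be sketched as follows. This is an illustrative filter under assumed metrics, not the paper's implementation: token-overlap similarity stands in for content preservation, and a simple length heuristic stands in for fluency (a real system might use language-model perplexity).

```python
def content_preservation(original: str, transferred: str) -> float:
    """Jaccard overlap of lowercase token sets (illustrative proxy
    for the paper's content-preservation metric)."""
    a = set(original.lower().split())
    b = set(transferred.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0


def fluency(sentence: str) -> float:
    """Placeholder fluency score; a real pipeline would likely use
    a language model's perplexity instead of a length heuristic."""
    n_tokens = len(sentence.split())
    return 1.0 if 3 <= n_tokens <= 60 else 0.0


def keep_transfer(original: str, transferred: str,
                  content_threshold: float = 0.5,
                  fluency_threshold: float = 0.5) -> bool:
    """Accept a style-transferred sentence only if it clears both
    the fluency and the content-preservation thresholds."""
    return (content_preservation(original, transferred) >= content_threshold
            and fluency(transferred) >= fluency_threshold)
```

Filtering the seq2seq outputs this way trades coverage for quality: sentences that drift too far from the source, or that read disfluently, are dropped rather than added to the dataset.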
Anthology ID:
2022.gebnlp-1.28
Volume:
Proceedings of the 4th Workshop on Gender Bias in Natural Language Processing (GeBNLP)
Month:
July
Year:
2022
Address:
Seattle, Washington
Venue:
GeBNLP
Publisher:
Association for Computational Linguistics
Note:
Pages:
273–281
URL:
https://aclanthology.org/2022.gebnlp-1.28
DOI:
10.18653/v1/2022.gebnlp-1.28
Cite (ACL):
Kartikey Pant and Tanvi Dadu. 2022. Incorporating Subjectivity into Gendered Ambiguous Pronoun (GAP) Resolution using Style Transfer. In Proceedings of the 4th Workshop on Gender Bias in Natural Language Processing (GeBNLP), pages 273–281, Seattle, Washington. Association for Computational Linguistics.
Cite (Informal):
Incorporating Subjectivity into Gendered Ambiguous Pronoun (GAP) Resolution using Style Transfer (Pant & Dadu, GeBNLP 2022)
PDF:
https://aclanthology.org/2022.gebnlp-1.28.pdf