Qianqian Qiao


2019

Transfer Learning from Pre-trained BERT for Pronoun Resolution
Xingce Bao | Qianqian Qiao
Proceedings of the First Workshop on Gender Bias in Natural Language Processing

The paper describes the submission of the team “We used bert!” to the shared task Gendered Pronoun Resolution (pairing pronouns with their correct entities). Our final submission, a model based on fine-tuned BERT (Bidirectional Encoder Representations from Transformers), ranks 14th among 838 teams with a multi-class logarithmic loss of 0.208. In this work, we investigate the contribution of transfer learning to pronoun resolution systems and evaluate the gender bias contained in the classification models.
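The shared task is scored with a multi-class logarithmic loss over the three candidate labels (entity A, entity B, or Neither). A minimal sketch of that metric is shown below; the function and variable names are illustrative and not taken from the paper.

```python
import numpy as np

def multiclass_log_loss(y_true, y_prob, eps=1e-15):
    """Three-class log loss for pronoun resolution predictions.

    y_true: shape (n,), gold class indices (0 = A, 1 = B, 2 = Neither).
    y_prob: shape (n, 3), predicted probabilities for each class.
    """
    # Clip to avoid log(0), then renormalize rows to sum to 1.
    y_prob = np.clip(y_prob, eps, 1 - eps)
    y_prob = y_prob / y_prob.sum(axis=1, keepdims=True)
    n = y_true.shape[0]
    # Average negative log-probability assigned to the correct class.
    return -np.log(y_prob[np.arange(n), y_true]).mean()

# Example: two snippets whose gold answers are A and Neither.
y_true = np.array([0, 2])
y_prob = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.3, 0.6]])
print(multiclass_log_loss(y_true, y_prob))
```

Lower values are better; a submission's reported score (here, 0.208) is this loss averaged over the evaluation set.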