TTUI at SemEval-2020 Task 11: Propaganda Detection with Transfer Learning and Ensembles

Moonsung Kim, Steven Bethard


Abstract
In this paper, we describe our approaches and systems for SemEval-2020 Task 11 on propaganda technique detection. We fine-tuned pre-trained BERT and RoBERTa models and then combined them with an averaging ensemble. We conducted several experiments on input representations that handle long texts while preserving context, as well as on the class-imbalance problem. Our system ranked 20th out of 36 teams with 0.398 F1 on the span identification (SI) task and 14th out of 31 teams with 0.556 F1 on the technique classification (TC) task.
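The abstract's averaging ensemble can be sketched briefly. The snippet below is a minimal illustration, not the authors' implementation: it assumes each fine-tuned model has already produced per-example class-probability distributions, and simply averages them before taking the argmax. The function name and the example probabilities are hypothetical.

```python
import numpy as np

def average_ensemble(prob_lists):
    """Average class-probability distributions from several models.

    prob_lists: list of arrays, each of shape (n_examples, n_classes).
    Returns the element-wise mean, shape (n_examples, n_classes).
    """
    stacked = np.stack(prob_lists)  # (n_models, n_examples, n_classes)
    return stacked.mean(axis=0)

# Hypothetical per-example class probabilities from two fine-tuned models.
bert_probs = np.array([[0.9, 0.1], [0.2, 0.8]])
roberta_probs = np.array([[0.7, 0.3], [0.4, 0.6]])

avg = average_ensemble([bert_probs, roberta_probs])  # [[0.8, 0.2], [0.3, 0.7]]
preds = avg.argmax(axis=1)                           # predicted class per example
```

Averaging probabilities (rather than hard-voting on labels) lets a confident model outweigh an uncertain one, which is one common motivation for this style of ensemble.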
Anthology ID:
2020.semeval-1.240
Volume:
Proceedings of the Fourteenth Workshop on Semantic Evaluation
Month:
December
Year:
2020
Address:
Barcelona (online)
Editors:
Aurelie Herbelot, Xiaodan Zhu, Alexis Palmer, Nathan Schneider, Jonathan May, Ekaterina Shutova
Venue:
SemEval
SIG:
SIGLEX
Publisher:
International Committee for Computational Linguistics
Pages:
1829–1834
URL:
https://aclanthology.org/2020.semeval-1.240
DOI:
10.18653/v1/2020.semeval-1.240
Cite (ACL):
Moonsung Kim and Steven Bethard. 2020. TTUI at SemEval-2020 Task 11: Propaganda Detection with Transfer Learning and Ensembles. In Proceedings of the Fourteenth Workshop on Semantic Evaluation, pages 1829–1834, Barcelona (online). International Committee for Computational Linguistics.
Cite (Informal):
TTUI at SemEval-2020 Task 11: Propaganda Detection with Transfer Learning and Ensembles (Kim & Bethard, SemEval 2020)
PDF:
https://aclanthology.org/2020.semeval-1.240.pdf